author: Nikolay Chehlarov
email: chehlarow@yahoo.com
git: https://github.com/Chehlarov
date: 17.02.2022.
This work describes the development of an image-segmentation algorithm for detecting Pt grain borders in SEM images. Four U-Net variants are implemented and evaluated; the Spatial Attention U-Net proved best suited for the task. Hyperparameter tuning is performed to find optimal settings for the model.
Pt films are used in engineering for various purposes, for example temperature or strain sensing. The distribution of grain sizes is an important property for the product function. Traditional image-processing approaches failed to achieve acceptable results, so a classical machine-learning algorithm was developed with good performance (https://github.com/Chehlarov/Machine-Learning/tree/main/00%20-%20project). In a production environment, however, images with different noise levels and scales have to be analyzed, and the ML approach did not deliver the expected generalization and production readiness. A deep-learning approach is expected to meet the production needs.
Create a model to perform image segmentation of Pt grains from an SEM image: each pixel should be classified as border or not border.
The task is semantic segmentation, for which several suitable architectures exist. The difference between grain-border segmentation and a typical segmentation task is that grain borders are continuous curves of small width. A similar standard task is retinal vessel segmentation. Looking at the top-performing models there, one easily notices that several of them are based on the U-Net architecture; U-Net is also among the most implemented papers. The U-Net variants can be summarized as:
U-Net is known to perform well even on small datasets. The available labeled dataset for border segmentation is small, so U-Net might be a suitable solution. A couple of the U-Net versions will be implemented and compared:
Highlights:
SA U-net architecture
attention module

Highlights:
U-net architecture

Highlights:
Attention U-net architecture

Attention gate

Highlights:
residual block

The implementation is based on several GitHub repositories, with modifications to adapt them to the current task. The code is based on TensorFlow 2.
import tensorflow as tf
from tensorflow.keras import models, layers, regularizers
from tensorflow.keras import backend as K
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator
import keras_tuner as kt
import tensorflow_addons as tfa
from focal_loss import BinaryFocalLoss
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation
from skimage import io, color, measure
from skimage.util import img_as_float32, img_as_ubyte
from skimage import filters
from skimage.feature import peak_local_max
from skimage.morphology import binary_dilation, flood_fill #skeletonize, disk, binary_closing,
from skimage.segmentation import clear_border, watershed
from scipy import ndimage
import pandas as pd
import cv2
import time
import os
# import seaborn as sns
This section contains the main building blocks for the U-net models.
# A few metrics and losses
def dice_coef(y_true, y_pred):
"""
Basically the F1 score: the harmonic mean of precision and recall
https://www.youtube.com/watch?v=AZr64OxshLo&t=15s
Better for optimization (as a loss function) because it tries to balance precision and recall.
"""
y_true_f = K.flatten(y_true)
y_true_f = tf.cast(y_true_f, tf.float32)
y_pred_f = K.flatten(y_pred)
intersection = K.sum(y_true_f * y_pred_f)
return (2.0 * intersection + 1.0) / (K.sum(y_true_f) + K.sum(y_pred_f) + 1.0)
def jacard_coef(y_true, y_pred):
"""
Basically IoU (intersection over union)
https://www.youtube.com/watch?v=AZr64OxshLo&t=15s
Better as a metric because it is easy to interpret.
"""
y_true_f = K.flatten(y_true)
y_true_f = tf.cast(y_true_f, tf.float32)
y_pred_f = K.flatten(y_pred)
intersection = K.sum(y_true_f * y_pred_f)
return (intersection + 1.0) / (K.sum(y_true_f) + K.sum(y_pred_f) - intersection + 1.0)
# not used
# def jacard_coef_loss(y_true, y_pred):
# return 1 - jacard_coef(y_true, y_pred)
def dice_coef_loss(y_true, y_pred):
return 1 - dice_coef(y_true, y_pred)
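The two coefficients are tightly related: without the +1.0 smoothing terms, Dice and Jaccard satisfy D = 2J/(1+J). A minimal NumPy sketch (independent of the TensorFlow implementations above) verifying this on toy vectors:

```python
import numpy as np

def dice_np(y_true, y_pred):
    # unsmoothed Dice coefficient: 2|A∩B| / (|A| + |B|)
    inter = np.sum(y_true * y_pred)
    return 2.0 * inter / (np.sum(y_true) + np.sum(y_pred))

def jaccard_np(y_true, y_pred):
    # unsmoothed Jaccard coefficient (IoU): |A∩B| / |A∪B|
    inter = np.sum(y_true * y_pred)
    return inter / (np.sum(y_true) + np.sum(y_pred) - inter)

y_true = np.array([0, 0, 1, 0, 1, 0, 1], dtype=float)
y_pred = np.array([1, 1, 0, 0, 1, 0, 1], dtype=float)
d, j = dice_np(y_true, y_pred), jaccard_np(y_true, y_pred)
assert abs(d - 2 * j / (1 + j)) < 1e-12  # Dice = 2*Jaccard / (1 + Jaccard)
```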
def mae_euclidean(y_true, y_pred):
"""
Mean absolute error of the Euclidean distance between the predicted and the true distance maps.
This metric is not very stable: one false-positive pixel inside a grain can change it significantly.
The metric is also biased by the grain-size distribution: fine-grained structures have smaller
Euclidean distances and hence smaller MAE values.
This function is not differentiable and cannot be used as a loss function.
"""
y_true_inv = (y_true == 0)
y_true_inv = tf.cast(y_true_inv, tf.uint8)
y_true_distance_map = tfa.image.euclidean_dist_transform(y_true_inv)
y_pred_inv = (y_pred < 0.5)
y_pred_inv = tf.cast(y_pred_inv, tf.uint8)
y_pred_distance_map = tfa.image.euclidean_dist_transform(y_pred_inv)
# res = tf.math.squared_difference(y_pred_distance_map, y_true_distance_map) # squared error leads to a high penalty for misclassification
res = tf.math.abs(tf.math.subtract(y_pred_distance_map, y_true_distance_map))
res = tf.keras.backend.mean(res)
return res
#unit test
y_true = np.array([0, 0, 1, 0, 1, 0, 1])
y_pred = np.array([1., 1, 0, 0, 1, 0, 1])
y_true = tf.expand_dims(y_true, -1)
y_pred = tf.expand_dims(y_pred, -1)
assert(np.abs(mae_euclidean(y_true, y_pred).numpy() - 0.5714286) < 0.0001)
# main blocks of the U-nets
class DropBlock2D(layers.Layer):
"""
See: https://arxiv.org/pdf/1810.12890.pdf
Used in SA-UNet
"""
def _normalize_data_format(self, value):
# https://stackoverflow.com/questions/53442190/importerror-cannot-import-name-normalize-data-format
if value is None:
value = K.image_data_format()
data_format = value.lower()
if data_format not in {'channels_first', 'channels_last'}:
raise ValueError('The `data_format` argument must be one of '
'"channels_first", "channels_last". Received: ' +
str(value))
return data_format
def __init__(self,
block_size,
keep_prob,
sync_channels=False,
data_format=None,
**kwargs):
"""Initialize the layer.
:param block_size: Size for each mask block.
:param keep_prob: Probability of keeping the original feature.
:param sync_channels: Whether to use the same dropout for all channels.
:param data_format: 'channels_first' or 'channels_last' (default).
:param kwargs: Arguments for parent class.
"""
super(DropBlock2D, self).__init__(**kwargs)
self.block_size = block_size
self.keep_prob = keep_prob
self.sync_channels = sync_channels
self.data_format = self._normalize_data_format(data_format)
self.input_spec = tf.keras.layers.InputSpec(ndim=4)
self.supports_masking = True
def get_config(self):
config = {'block_size': self.block_size,
'keep_prob': self.keep_prob,
'sync_channels': self.sync_channels,
'data_format': self.data_format}
base_config = super(DropBlock2D, self).get_config()
return dict(list(base_config.items()) + list(config.items()))
def compute_mask(self, inputs, mask=None):
return mask
def compute_output_shape(self, input_shape):
return input_shape
def _get_gamma(self, height, width):
"""Get the number of activation units to drop"""
height, width = K.cast(height, K.floatx()), K.cast(width, K.floatx())
block_size = K.constant(self.block_size, dtype=K.floatx())
return ((1.0 - self.keep_prob) / (block_size ** 2)) *\
(height * width / ((height - block_size + 1.0) * (width - block_size + 1.0)))
def _compute_valid_seed_region(self, height, width):
positions = K.concatenate([
tf.expand_dims(K.tile(K.expand_dims(tf.range(height), axis=1), [1, width]), axis=-1),
tf.expand_dims(K.tile(K.expand_dims(tf.range(width), axis=0), [height, 1]), axis=-1),
], axis=-1)
half_block_size = self.block_size // 2
valid_seed_region = K.switch(
K.all(
K.stack(
[
positions[:, :, 0] >= half_block_size,
positions[:, :, 1] >= half_block_size,
positions[:, :, 0] < height - half_block_size,
positions[:, :, 1] < width - half_block_size,
],
axis=-1,
),
axis=-1,
),
tf.ones((height, width)),
tf.zeros((height, width)),
)
return tf.expand_dims(tf.expand_dims(valid_seed_region, axis=0), axis=-1)
def _compute_drop_mask(self, shape):
height, width = shape[1], shape[2]
mask = tf.keras.backend.random_bernoulli(shape, p=self._get_gamma(height, width))
mask *= self._compute_valid_seed_region(height, width)
mask = tf.keras.layers.MaxPool2D(
pool_size=(self.block_size, self.block_size),
padding='same',
strides=1,
data_format='channels_last',
)(mask)
return 1.0 - mask
def call(self, inputs, training=None):
def dropped_inputs():
outputs = inputs
if self.data_format == 'channels_first':
outputs = K.permute_dimensions(outputs, [0, 2, 3, 1])
shape = K.shape(outputs)
if self.sync_channels:
mask = self._compute_drop_mask([shape[0], shape[1], shape[2], 1])
else:
mask = self._compute_drop_mask(shape)
outputs = outputs * mask *\
(K.cast(K.prod(shape), dtype=K.floatx()) / K.sum(mask))
if self.data_format == 'channels_first':
outputs = K.permute_dimensions(outputs, [0, 3, 1, 2])
return outputs
return K.in_train_phase(dropped_inputs, inputs, training=training)
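A plain-Python sketch of the seed-probability formula in `_get_gamma`: each Bernoulli seed removes roughly a `block_size` x `block_size` area, so the seed probability gamma must be much smaller than the target drop rate 1 - keep_prob (the feature-map size below is illustrative):

```python
def drop_block_gamma(height, width, block_size, keep_prob):
    # mirrors DropBlock2D._get_gamma: per-position probability of seeding a
    # dropped block, with a correction for blocks clipped at the borders
    return ((1.0 - keep_prob) / block_size ** 2) * (
        height * width / ((height - block_size + 1.0) * (width - block_size + 1.0))
    )

gamma = drop_block_gamma(176, 256, block_size=7, keep_prob=0.9)
assert 0.0 < gamma < 1.0 - 0.9  # far below the 10% target drop rate
```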
def spatial_attention(input_feature):
"""
Used in the bottleneck of SA-UNet
"""
kernel_size = 7
if K.image_data_format() == "channels_first":
channel = input_feature.shape[1]
cbam_feature = layers.Permute((2, 3, 1))(input_feature)
else:
channel = input_feature.shape[-1]
cbam_feature = input_feature
avg_pool = layers.Lambda(lambda x: K.mean(x, axis=3, keepdims=True))(cbam_feature)
assert avg_pool.shape[-1] == 1 # assert avg_pool._keras_shape[-1] == 1
max_pool = layers.Lambda(lambda x: K.max(x, axis=3, keepdims=True))(cbam_feature)
assert max_pool.shape[-1] == 1 # assert max_pool._keras_shape[-1] == 1
concat = layers.Concatenate(axis=3)([avg_pool, max_pool])
assert concat.shape[-1] == 2
cbam_feature = layers.Conv2D(filters=1,
kernel_size=kernel_size,
strides=1,
padding='same',
activation='sigmoid',
kernel_initializer='he_normal',
use_bias=False)(concat)
assert cbam_feature.shape[-1] == 1 # assert cbam_feature._keras_shape[-1] == 1
if K.image_data_format() == "channels_first":
cbam_feature = layers.Permute((3, 1, 2))(cbam_feature)
return layers.multiply([input_feature, cbam_feature])
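The pooling step of `spatial_attention` can be illustrated in NumPy: channel-wise mean and max pooling each yield a single-channel map, and their concatenation is the 2-channel input to the 7x7 sigmoid convolution (the feature-map size below is hypothetical):

```python
import numpy as np

feat = np.random.rand(1, 22, 32, 128)        # NHWC bottleneck feature map (hypothetical size)
avg_pool = feat.mean(axis=3, keepdims=True)  # average over channels
max_pool = feat.max(axis=3, keepdims=True)   # maximum over channels
concat = np.concatenate([avg_pool, max_pool], axis=3)
assert concat.shape == (1, 22, 32, 2)
# a 7x7 conv + sigmoid then collapses these 2 channels into a 1-channel
# attention map that rescales the input feature map element-wise
```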
def conv_block(x, filter_size, size, dropout, batch_norm=False):
"""
Convolution block of 2 Conv2D
"""
conv = layers.Conv2D(size, (filter_size, filter_size), padding="same")(x)
if batch_norm is True:
conv = layers.BatchNormalization(axis=3)(conv)
conv = layers.Activation("relu")(conv)
conv = layers.Conv2D(size, (filter_size, filter_size), padding="same")(conv)
if batch_norm is True:
conv = layers.BatchNormalization(axis=3)(conv)
conv = layers.Activation("relu")(conv)
if dropout > 0:
conv = layers.Dropout(dropout)(conv)
return conv
def repeat_elem(tensor, rep):
"""
Lambda function that repeats the elements of a tensor along axis 3 by a factor of rep.
If the tensor has shape (None, 256, 256, 3) and rep=2, the result has shape (None, 256, 256, 6).
"""
return layers.Lambda(lambda x, repnum: K.repeat_elements(x, repnum, axis=3),
arguments={'repnum': rep})(tensor)
def res_conv_block(x, filter_size, size, dropout, batch_norm=False):
"""
Residual convolutional block.
conv - BN - Activation - conv - BN - dropout; shortcut: 1x1 conv - BN; then add and Activation
Check fig 4 in https://arxiv.org/ftp/arxiv/papers/1802/1802.06955.pdf
"""
conv = layers.Conv2D(size, (filter_size, filter_size), padding='same')(x)
if batch_norm is True:
conv = layers.BatchNormalization(axis=3)(conv)
conv = layers.Activation('relu')(conv)
conv = layers.Conv2D(size, (filter_size, filter_size), padding='same')(conv)
if batch_norm is True:
conv = layers.BatchNormalization(axis=3)(conv)
#conv = layers.Activation('relu')(conv) #Activation before addition with shortcut
if dropout > 0:
conv = layers.Dropout(dropout)(conv)
shortcut = layers.Conv2D(size, kernel_size=(1, 1), padding='same')(x)
if batch_norm is True:
shortcut = layers.BatchNormalization(axis=3)(shortcut)
res_path = layers.add([shortcut, conv])
res_path = layers.Activation('relu')(res_path) #Activation after addition with shortcut (Original residual block)
return res_path
def gating_signal(input, out_size, batch_norm=False):
"""
resize the down layer feature map into the same dimension as the up layer feature map
using 1x1 conv
:return: the gating feature map with the same dimension of the up layer feature map
"""
x = layers.Conv2D(out_size, (1, 1), padding='same')(input)
if batch_norm:
x = layers.BatchNormalization()(x)
x = layers.Activation('relu')(x)
return x
def attention_block(x, gating, inter_shape):
"""
Attention gate;
see https://arxiv.org/pdf/1804.03999.pdf fig 2
"""
shape_x = K.int_shape(x)
shape_g = K.int_shape(gating)
# Getting the x signal to the same shape as the gating signal
theta_x = layers.Conv2D(inter_shape, (2, 2), strides=(2, 2), padding='same')(x) # 16
shape_theta_x = K.int_shape(theta_x)
# Getting the gating signal to the same number of filters as the inter_shape
phi_g = layers.Conv2D(inter_shape, (1, 1), padding='same')(gating)
upsample_g = layers.Conv2DTranspose(inter_shape, (3, 3),
strides=(shape_theta_x[1] // shape_g[1], shape_theta_x[2] // shape_g[2]),
padding='same')(phi_g) # 16
concat_xg = layers.add([upsample_g, theta_x])
act_xg = layers.Activation('relu')(concat_xg)
psi = layers.Conv2D(1, (1, 1), padding='same')(act_xg)
sigmoid_xg = layers.Activation('sigmoid')(psi)
shape_sigmoid = K.int_shape(sigmoid_xg)
upsample_psi = layers.UpSampling2D(size=(shape_x[1] // shape_sigmoid[1], shape_x[2] // shape_sigmoid[2]))(sigmoid_xg) # 32
upsample_psi = repeat_elem(upsample_psi, shape_x[3])
y = layers.multiply([upsample_psi, x])
result = layers.Conv2D(shape_x[3], (1, 1), padding='same')(y)
result_bn = layers.BatchNormalization()(result)
return result_bn
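The stride arithmetic in `attention_block` can be checked by hand. For the first gate (UpRes 6) with a 176x256 input, the skip connection `conv_16` is 22x32 and the gating signal from the bottleneck is 11x16; the sketch below just replays that arithmetic (sizes assume the 176x256 input used later):

```python
shape_x = (None, 22, 32, 512)   # skip connection conv_16 for a 176x256 input
shape_g = (None, 11, 16, 512)   # gating signal derived from the bottleneck
# theta_x: 2x2 conv with stride 2 halves the skip connection's resolution
shape_theta_x = (None, shape_x[1] // 2, shape_x[2] // 2, 512)
# Conv2DTranspose strides that align the gating map with theta_x
strides = (shape_theta_x[1] // shape_g[1], shape_theta_x[2] // shape_g[2])
assert strides == (1, 1)        # already aligned at this level
# UpSampling2D factor that brings the attention coefficients back to x's size
up = (shape_x[1] // shape_theta_x[1], shape_x[2] // shape_theta_x[2])
assert up == (2, 2)
```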
# models
def UNet(input_shape, NUM_CLASSES=1, dropout_rate=0.0, batch_norm=True):
"""
UNet
"""
# network structure
FILTER_NUM = 64 # number of filters for the first layer
FILTER_SIZE = 3 # size of the convolutional filter
UP_SAMP_SIZE = 2 # size of upsampling filters
inputs = layers.Input(input_shape, dtype=tf.float32)
# Downsampling layers
# DownRes 1, convolution + pooling
conv_128 = conv_block(inputs, FILTER_SIZE, FILTER_NUM, dropout_rate, batch_norm)
pool_64 = layers.MaxPooling2D(pool_size=(2,2))(conv_128)
# DownRes 2
conv_64 = conv_block(pool_64, FILTER_SIZE, 2*FILTER_NUM, dropout_rate, batch_norm)
pool_32 = layers.MaxPooling2D(pool_size=(2,2))(conv_64)
# DownRes 3
conv_32 = conv_block(pool_32, FILTER_SIZE, 4*FILTER_NUM, dropout_rate, batch_norm)
pool_16 = layers.MaxPooling2D(pool_size=(2,2))(conv_32)
# DownRes 4
conv_16 = conv_block(pool_16, FILTER_SIZE, 8*FILTER_NUM, dropout_rate, batch_norm)
pool_8 = layers.MaxPooling2D(pool_size=(2,2))(conv_16)
# DownRes 5, convolution only
conv_8 = conv_block(pool_8, FILTER_SIZE, 16*FILTER_NUM, dropout_rate, batch_norm)
# Upsampling layers
up_16 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(conv_8)
up_16 = layers.concatenate([up_16, conv_16], axis=3)
up_conv_16 = conv_block(up_16, FILTER_SIZE, 8*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 7
up_32 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_16)
up_32 = layers.concatenate([up_32, conv_32], axis=3)
up_conv_32 = conv_block(up_32, FILTER_SIZE, 4*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 8
up_64 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_32)
up_64 = layers.concatenate([up_64, conv_64], axis=3)
up_conv_64 = conv_block(up_64, FILTER_SIZE, 2*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 9
up_128 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_64)
up_128 = layers.concatenate([up_128, conv_128], axis=3)
up_conv_128 = conv_block(up_128, FILTER_SIZE, FILTER_NUM, dropout_rate, batch_norm)
# 1*1 convolutional layers
conv_final = layers.Conv2D(NUM_CLASSES, kernel_size=(1,1))(up_conv_128)
conv_final = layers.BatchNormalization(axis=3)(conv_final)
conv_final = layers.Activation('sigmoid')(conv_final) #Change to softmax for multichannel
# Model
model = models.Model(inputs, conv_final, name="UNet")
#print(model.summary())
return model
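With four MaxPooling2D stages, the encoder halves the spatial resolution four times, so both input dimensions must be divisible by 2^4 = 16. A quick check for the 176x256 patches used later:

```python
h, w = 176, 256
sizes = [(h, w)]
for _ in range(4):              # four 2x2 max-pooling stages
    h, w = h // 2, w // 2
    sizes.append((h, w))
assert sizes[-1] == (11, 16)    # bottleneck (conv_8) resolution
assert 176 % 16 == 0 and 256 % 16 == 0  # halving is lossless at every stage
```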
def Attention_UNet(input_shape, NUM_CLASSES=1, dropout_rate=0.0, batch_norm=True):
"""
Attention UNet
"""
# network structure
FILTER_NUM = 64 # number of basic filters for the first layer
FILTER_SIZE = 3 # size of the convolutional filter
UP_SAMP_SIZE = 2 # size of upsampling filters
inputs = layers.Input(input_shape, dtype=tf.float32)
# Downsampling layers
# DownRes 1, convolution + pooling
conv_128 = conv_block(inputs, FILTER_SIZE, FILTER_NUM, dropout_rate, batch_norm)
pool_64 = layers.MaxPooling2D(pool_size=(2,2))(conv_128)
# DownRes 2
conv_64 = conv_block(pool_64, FILTER_SIZE, 2*FILTER_NUM, dropout_rate, batch_norm)
pool_32 = layers.MaxPooling2D(pool_size=(2,2))(conv_64)
# DownRes 3
conv_32 = conv_block(pool_32, FILTER_SIZE, 4*FILTER_NUM, dropout_rate, batch_norm)
pool_16 = layers.MaxPooling2D(pool_size=(2,2))(conv_32)
# DownRes 4
conv_16 = conv_block(pool_16, FILTER_SIZE, 8*FILTER_NUM, dropout_rate, batch_norm)
pool_8 = layers.MaxPooling2D(pool_size=(2,2))(conv_16)
# DownRes 5, convolution only
conv_8 = conv_block(pool_8, FILTER_SIZE, 16*FILTER_NUM, dropout_rate, batch_norm)
# Upsampling layers
# UpRes 6, attention gated concatenation + upsampling + double residual convolution
gating_16 = gating_signal(conv_8, 8*FILTER_NUM, batch_norm)
att_16 = attention_block(conv_16, gating_16, 8*FILTER_NUM)
up_16 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(conv_8)
up_16 = layers.concatenate([up_16, att_16], axis=3)
up_conv_16 = conv_block(up_16, FILTER_SIZE, 8*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 7
gating_32 = gating_signal(up_conv_16, 4*FILTER_NUM, batch_norm)
att_32 = attention_block(conv_32, gating_32, 4*FILTER_NUM)
up_32 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_16)
up_32 = layers.concatenate([up_32, att_32], axis=3)
up_conv_32 = conv_block(up_32, FILTER_SIZE, 4*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 8
gating_64 = gating_signal(up_conv_32, 2*FILTER_NUM, batch_norm)
att_64 = attention_block(conv_64, gating_64, 2*FILTER_NUM)
up_64 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_32)
up_64 = layers.concatenate([up_64, att_64], axis=3)
up_conv_64 = conv_block(up_64, FILTER_SIZE, 2*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 9
gating_128 = gating_signal(up_conv_64, FILTER_NUM, batch_norm)
att_128 = attention_block(conv_128, gating_128, FILTER_NUM)
up_128 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_64)
up_128 = layers.concatenate([up_128, att_128], axis=3)
up_conv_128 = conv_block(up_128, FILTER_SIZE, FILTER_NUM, dropout_rate, batch_norm)
# 1*1 convolutional layers
conv_final = layers.Conv2D(NUM_CLASSES, kernel_size=(1,1))(up_conv_128)
conv_final = layers.BatchNormalization(axis=3)(conv_final)
conv_final = layers.Activation('sigmoid')(conv_final) #Change to softmax for multichannel
# Model integration
model = models.Model(inputs, conv_final, name="Attention_UNet")
return model
def Attention_ResUNet(input_shape, NUM_CLASSES=1, dropout_rate=0.0, batch_norm=True):
"""
Attention Residual UNet
"""
# network structure
FILTER_NUM = 64 # number of basic filters for the first layer
FILTER_SIZE = 3 # size of the convolutional filter
UP_SAMP_SIZE = 2 # size of upsampling filters
# input data
# dimension of the image depth
inputs = layers.Input(input_shape, dtype=tf.float32)
axis = 3
# Downsampling layers
# DownRes 1, double residual convolution + pooling
conv_128 = res_conv_block(inputs, FILTER_SIZE, FILTER_NUM, dropout_rate, batch_norm)
pool_64 = layers.MaxPooling2D(pool_size=(2,2))(conv_128)
# DownRes 2
conv_64 = res_conv_block(pool_64, FILTER_SIZE, 2*FILTER_NUM, dropout_rate, batch_norm)
pool_32 = layers.MaxPooling2D(pool_size=(2,2))(conv_64)
# DownRes 3
conv_32 = res_conv_block(pool_32, FILTER_SIZE, 4*FILTER_NUM, dropout_rate, batch_norm)
pool_16 = layers.MaxPooling2D(pool_size=(2,2))(conv_32)
# DownRes 4
conv_16 = res_conv_block(pool_16, FILTER_SIZE, 8*FILTER_NUM, dropout_rate, batch_norm)
pool_8 = layers.MaxPooling2D(pool_size=(2,2))(conv_16)
# DownRes 5, convolution only
conv_8 = res_conv_block(pool_8, FILTER_SIZE, 16*FILTER_NUM, dropout_rate, batch_norm)
# Upsampling layers
# UpRes 6, attention gated concatenation + upsampling + double residual convolution
gating_16 = gating_signal(conv_8, 8*FILTER_NUM, batch_norm)
att_16 = attention_block(conv_16, gating_16, 8*FILTER_NUM)
up_16 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(conv_8)
up_16 = layers.concatenate([up_16, att_16], axis=axis)
up_conv_16 = res_conv_block(up_16, FILTER_SIZE, 8*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 7
gating_32 = gating_signal(up_conv_16, 4*FILTER_NUM, batch_norm)
att_32 = attention_block(conv_32, gating_32, 4*FILTER_NUM)
up_32 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_16)
up_32 = layers.concatenate([up_32, att_32], axis=axis)
up_conv_32 = res_conv_block(up_32, FILTER_SIZE, 4*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 8
gating_64 = gating_signal(up_conv_32, 2*FILTER_NUM, batch_norm)
att_64 = attention_block(conv_64, gating_64, 2*FILTER_NUM)
up_64 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_32)
up_64 = layers.concatenate([up_64, att_64], axis=axis)
up_conv_64 = res_conv_block(up_64, FILTER_SIZE, 2*FILTER_NUM, dropout_rate, batch_norm)
# UpRes 9
gating_128 = gating_signal(up_conv_64, FILTER_NUM, batch_norm)
att_128 = attention_block(conv_128, gating_128, FILTER_NUM)
up_128 = layers.UpSampling2D(size=(UP_SAMP_SIZE, UP_SAMP_SIZE), data_format="channels_last")(up_conv_64)
up_128 = layers.concatenate([up_128, att_128], axis=axis)
up_conv_128 = res_conv_block(up_128, FILTER_SIZE, FILTER_NUM, dropout_rate, batch_norm)
# 1*1 convolutional layers
conv_final = layers.Conv2D(NUM_CLASSES, kernel_size=(1,1))(up_conv_128)
conv_final = layers.BatchNormalization(axis=axis)(conv_final)
conv_final = layers.Activation('sigmoid')(conv_final) #Change to softmax for multichannel
# Model integration
model = models.Model(inputs, conv_final, name="AttentionResUNet")
return model
def SA_UNet(input_shape, block_size=7, keep_prob=0.9, start_neurons=16):
"""
Spatial Attention U-Net
"""
inputs = layers.Input(input_shape)
conv1 = layers.Conv2D(start_neurons * 1, (3, 3), activation=None, padding="same")(inputs)
conv1 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(conv1)
conv1= layers.BatchNormalization()(conv1)
conv1 = layers.Activation('relu')(conv1)
conv1 = layers.Conv2D(start_neurons * 1, (3, 3), activation=None, padding="same")(conv1)
conv1 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(conv1)
conv1 = layers.BatchNormalization()(conv1)
conv1 = layers.Activation('relu')(conv1)
pool1 = layers.MaxPooling2D((2, 2))(conv1)
conv2 = layers.Conv2D(start_neurons * 2, (3, 3), activation=None, padding="same")(pool1)
conv2 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(conv2)
conv2 = layers.BatchNormalization()(conv2)
conv2 = layers.Activation('relu')(conv2)
conv2 = layers.Conv2D(start_neurons * 2, (3, 3), activation=None, padding="same")(conv2)
conv2 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(conv2)
conv2 = layers.BatchNormalization()(conv2)
conv2 = layers.Activation('relu')(conv2)
pool2 = layers.MaxPooling2D((2, 2))(conv2)
conv3 = layers.Conv2D(start_neurons * 4, (3, 3), activation=None, padding="same")(pool2)
conv3 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(conv3)
conv3 = layers.BatchNormalization()(conv3)
conv3 = layers.Activation('relu')(conv3)
conv3 = layers.Conv2D(start_neurons * 4, (3, 3), activation=None, padding="same")(conv3)
conv3 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(conv3)
conv3 = layers.BatchNormalization()(conv3)
conv3 = layers.Activation('relu')(conv3)
pool3 = layers.MaxPooling2D((2, 2))(conv3)
convm = layers.Conv2D(start_neurons * 8, (3, 3), activation=None, padding="same")(pool3)
convm = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(convm)
convm = layers.BatchNormalization()(convm)
convm = layers.Activation('relu')(convm)
convm = spatial_attention(convm)
convm = layers.Conv2D(start_neurons * 8, (3, 3), activation=None, padding="same")(convm)
convm = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(convm)
convm = layers.BatchNormalization()(convm)
convm = layers.Activation('relu')(convm)
deconv3 = layers.Conv2DTranspose(start_neurons * 4, (3, 3), strides=(2, 2), padding="same")(convm)
uconv3 = layers.concatenate([deconv3, conv3])
uconv3 = layers.Conv2D(start_neurons * 4, (3, 3), activation=None, padding="same")(uconv3)
uconv3 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(uconv3)
uconv3 = layers.BatchNormalization()(uconv3)
uconv3 = layers.Activation('relu')(uconv3)
uconv3 = layers.Conv2D(start_neurons * 4, (3, 3), activation=None, padding="same")(uconv3)
uconv3 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(uconv3)
uconv3 = layers.BatchNormalization()(uconv3)
uconv3 = layers.Activation('relu')(uconv3)
deconv2 = layers.Conv2DTranspose(start_neurons * 2, (3, 3), strides=(2, 2), padding="same")(uconv3)
uconv2 = layers.concatenate([deconv2, conv2])
uconv2 = layers.Conv2D(start_neurons * 2, (3, 3), activation=None, padding="same")(uconv2)
uconv2 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(uconv2)
uconv2 = layers.BatchNormalization()(uconv2)
uconv2 = layers.Activation('relu')(uconv2)
uconv2 = layers.Conv2D(start_neurons * 2, (3, 3), activation=None, padding="same")(uconv2)
uconv2 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(uconv2)
uconv2 = layers.BatchNormalization()(uconv2)
uconv2 = layers.Activation('relu')(uconv2)
deconv1 = layers.Conv2DTranspose(start_neurons * 1, (3, 3), strides=(2, 2), padding="same")(uconv2)
uconv1 = layers.concatenate([deconv1, conv1])
uconv1 = layers.Conv2D(start_neurons * 1, (3, 3), activation=None, padding="same")(uconv1)
uconv1 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(uconv1)
uconv1 = layers.BatchNormalization()(uconv1)
uconv1 = layers.Activation('relu')(uconv1)
uconv1 = layers.Conv2D(start_neurons * 1, (3, 3), activation=None, padding="same")(uconv1)
uconv1 = DropBlock2D(block_size=block_size, keep_prob=keep_prob)(uconv1)
uconv1 = layers.BatchNormalization()(uconv1)
uconv1 = layers.Activation('relu')(uconv1)
output_layer_noActi = layers.Conv2D(1, (1, 1), padding="same", activation=None)(uconv1)
output_layer = layers.Activation('sigmoid')(output_layer_noActi)
model = models.Model(inputs=inputs, outputs=output_layer, name="SA_UNet")
# model.compile(optimizer=Adam(learning_rate=1e-3),
#               loss='binary_crossentropy',
#               metrics=['accuracy', dice_coef, mae_euclidean])
return model
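Unlike the other three models, SA-UNet downsamples only three times before the spatial-attention bottleneck, so its inputs need only be divisible by 2^3 = 8. A quick check for the 176x256 patches:

```python
h, w = 176, 256
for _ in range(3):              # SA-UNet has three 2x2 pooling stages
    h, w = h // 2, w // 2
assert (h, w) == (22, 32)       # resolution at which spatial_attention operates
assert 176 % 8 == 0 and 256 % 8 == 0
```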
# functions for model analysis
def plot_history(history, title=""):
"""
Plots a training history
"""
fig = plt.figure(figsize=(16, 8))
fig.suptitle(title, fontsize=16)
plt.subplot(1, 3, 1)
loss = history.history['loss']
val_loss = history.history['val_loss']
epochs = range(1, len(loss) + 1)
plt.plot(epochs, loss, 'y', label='Training loss')
plt.plot(epochs, val_loss, 'r', label='Validation loss')
plt.title('Training and validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.ylim(0, 0.5)
plt.legend()
plt.grid()
plt.subplot(1, 3, 2)
plt.plot(epochs, history.history['mae_euclidean'], 'y', label='mae_euclidean')
plt.plot(epochs, history.history['val_mae_euclidean'], 'r', label='val_mae_euclidean')
plt.title('MAE euclidean distances from borders')
plt.xlabel('Epochs')
plt.ylabel('MAE euclidean')
plt.ylim(0, 10)
plt.legend()
plt.grid()
plt.subplot(1, 3, 3)
plt.plot(epochs, history.history['jacard_coef'], 'y', label='jacard_coef')
plt.plot(epochs, history.history['val_jacard_coef'], 'r', label='val_jacard_coef')
plt.title('Training and validation jacard coef')
plt.xlabel('Epochs')
plt.ylabel('jacard coef')
plt.ylim(0, 0.8)
plt.legend()
plt.grid()
plt.show()
def visual_eval(model, test_img, mask_expected=None, thresh = 0.5, show=True):
"""
Provides visual evaluation for the performance of a model.
"""
test_img = img_as_float32(test_img).copy()
original_img = test_img[:, :, 0].copy()
X = np.expand_dims(test_img, axis=0)
# X = np.expand_dims(X, axis=-1)
mask_predicted = model.predict(X)[0, :, :, 0]
mask_predicted = mask_predicted > thresh # thresholding
img_review = color.gray2rgb(original_img)
if mask_expected is not None:
mask_expected = mask_expected[:, :, 0].astype(bool)
img_review[mask_expected & mask_predicted] = (0, 1, 0) # tp, green
img_review[mask_expected & ~mask_predicted] = (0, 0, 1) # fn, low recall score
img_review[~mask_expected & mask_predicted] = (1, 1, 0) # fp, low precision score
else:
img_review[mask_predicted] = (0, 1, 0) # predicted,
# adding legend
cv2.putText(img_review, "TP", (5, 10), cv2.FONT_HERSHEY_SIMPLEX, 0.3, (0, 1, 0), 1)
cv2.putText(img_review, "FP", (18, 10), cv2.FONT_HERSHEY_SIMPLEX, 0.3, (1, 1, 0), 1)
cv2.putText(img_review, "FN", (31, 10), cv2.FONT_HERSHEY_SIMPLEX, 0.3, (0, 0, 1), 1)
if show:
plt.figure(figsize=(12,12))
plt.imshow(img_review)
plt.title("Mark-up picture")
plt.show()
return img_review #, mask_expected, mask_predicted
def plot_train_val(model, idx_train=None, idx_val=None):
# plot prediction accuracy on train and validation dataset
if idx_train is None:
idx_train = np.random.randint(0, X_train.shape[0])
if idx_val is None:
idx_val = np.random.randint(0, X_val.shape[0])
plt.figure(figsize=(18,10))
plt.subplot(1, 2, 1)
im = visual_eval(model=model, test_img=X_train[idx_train], mask_expected=y_train[idx_train], show=False)
plt.imshow(im)
plt.title(f"Train image {idx_train}")
plt.subplot(1, 2, 2)
im = visual_eval(model=model, test_img=X_val[idx_val], mask_expected=y_val[idx_val], show=False)
plt.imshow(im)
plt.title(f"Validation image {idx_val}")
plt.show()
Train and test datasets are organized in separate folders. The dataset is prepared from one image at 500x magnification and a second at 1000x. The images are sliced into 256x176 patches. Augmentation is applied by median-filtering each patch. The patches are split so that a patch and its augmented version end up together in either train, validation, or test; this avoids information leakage between the splits. Augmentation could be automated with a data generator, but some augmentation settings may produce unrealistic images, so it is strongly recommended to visually check the augmented images if such an approach is considered.
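The slicing into non-overlapping 256x176 patches can be sketched with plain NumPy indexing (the 704x1024 source size below is hypothetical; the real SEM images may differ):

```python
import numpy as np

img = np.random.rand(704, 1024)  # hypothetical SEM image, grayscale float
PH, PW = 176, 256                # patch height x width, matching the dataset
patches = [img[r:r + PH, c:c + PW]
           for r in range(0, img.shape[0] - PH + 1, PH)
           for c in range(0, img.shape[1] - PW + 1, PW)]
assert len(patches) == (704 // PH) * (1024 // PW)  # 4 * 4 = 16 patches
assert patches[0].shape == (PH, PW)
```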
def create_augmented_img(path):
"""
Creates augmented images by applying a median filter.
"""
for f_name in os.listdir(path):
if f_name.split("_")[-1] == 'image.tif' and f_name.startswith("patch1000_"):
print(f"Working on {f_name}")
img = io.imread(path + r"\\" + f_name)
img = img_as_float32(img)
aug = filters.median(img)
new_name = str(path + r"\\" + f_name).replace("patch1000_", "patch1000_aug")
io.imsave(new_name, img_as_ubyte(aug))
mask_name = f_name.replace("image", "mask")
mask = io.imread(path + r"\\" + mask_name)
new_name = new_name.replace("_image", "_mask")
io.imsave(new_name, img_as_ubyte(mask))
#create_augmented_img(r"datasets/train") # already done
def load_data(path):
"""
Loads images and masks from a folder.
All images and masks should be 256x176 (width x height).
"""
imgs = []
masks = []
for f_name in os.listdir(path):
if f_name.split("_")[-1] == 'image.tif':
# print(f"Working on {f_name}")
img = io.imread(path + r"\\" + f_name)
assert img.shape == (176, 256)
img = img_as_float32(img)
imgs.append(img)
mask_name = f_name.replace("image", "mask")
mask = io.imread(path + r"\\" + mask_name)
mask = mask // 255
masks.append(mask)
imgs = np.expand_dims(np.array(imgs), -1)
masks = np.expand_dims(np.array(masks), -1)
return imgs, masks
X_train, y_train = load_data("datasets/train")
X_val, y_val = load_data("datasets/validation")
X_test, y_test = load_data("datasets/test")
X_train = tf.cast(X_train, tf.float16)
X_val = tf.cast(X_val, tf.float16)
X_test = tf.cast(X_test, tf.float16)
X_train.shape, X_val.shape, X_test.shape
(TensorShape([32, 176, 256, 1]), TensorShape([8, 176, 256, 1]), TensorShape([4, 176, 256, 1]))
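The cast to float16 above halves memory, but float16 overflows above 65504, and anything larger becomes infinite. This may be related to the astronomically large `val_mae_euclidean` values that appear in the training log later; a tiny numpy illustration of the limit:

```python
import numpy as np

# float16 has a maximum representable value of 65504; anything larger
# overflows to inf, which can silently poison downstream metrics
print(np.finfo(np.float16).max)  # 65504.0
x = np.float16(70000.0)
print(x)                         # inf
```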
# sanity check for matching of images and masks; run a couple of times
t = np.random.randint(X_train.shape[0])
plt.figure(figsize=(12,5))
plt.subplot(1, 2, 1)
plt.imshow(tf.cast(X_train[t], float), cmap='gray')
plt.subplot(1, 2, 2)
plt.imshow(y_train[t], cmap='gray')
plt.show()
total_border_pixels = np.sum(y_train)
total_pixels = y_train.size
border_pix_ratio = total_border_pixels / total_pixels
print(f"Grain border pixels / all pixels = {border_pix_ratio}")
Grain border pixels / all pixels = 0.07478609952059659
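With only about 7.5% border pixels, the classes are heavily imbalanced, which motivates an overlap-based loss such as Dice. If a weighted binary cross-entropy were tried instead (not what this notebook uses), the positive-class weight could be derived from this ratio; a numpy sketch using the value printed above:

```python
# ratio of border pixels observed in the training masks (from the cell above)
border_pix_ratio = 0.0748

# weight each border pixel by the inverse class frequency so that, on
# average, both classes contribute equally to a weighted cross-entropy
pos_weight = (1.0 - border_pix_ratio) / border_pix_ratio
print(round(pos_weight, 2))  # ~12.37
```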
IMG_HEIGHT = X_train.shape[1]
IMG_WIDTH = X_train.shape[2]
IMG_CHANNELS = X_train.shape[3]
input_shape = (IMG_HEIGHT,IMG_WIDTH,IMG_CHANNELS)
input_shape
(176, 256, 1)
Several iterations and hyperparameter searches have been carried out in the background. The next sections start from settings known to produce good results.
Objectives:
As a starting point, the following parameters were found to work best: block_size=19, keep_prob=0.8, start_neurons=20
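The background searches can be organized as a simple grid over these three parameters. A sketch that only enumerates the candidate settings (the value grids here are illustrative, not the ones actually searched):

```python
from itertools import product

# illustrative candidate grids around the values found to work well
grid = {
    "block_size": [7, 13, 19],
    "keep_prob": [0.7, 0.8, 0.9],
    "start_neurons": [16, 20, 24],
}

configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(len(configs))  # 27 candidate settings
# each config would be passed to SA_UNet(input_shape, **config),
# trained briefly, and ranked by validation jacard_coef
```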
# SA Unet1
tf.random.set_seed(14)
model_sa_unet1 = SA_UNet(input_shape, block_size=19, keep_prob=0.8, start_neurons=20)
model_sa_unet1.compile(optimizer=Adam(learning_rate=1e-2),
loss = dice_coef_loss,
metrics=[mae_euclidean, jacard_coef]
)
start = time.time()
history_sa_unet1 = model_sa_unet1.fit(X_train, y_train,
validation_data = (X_val, y_val),
batch_size = 8, # no resources for 16
epochs=200
)
print(f"Model trained for {time.time() - start}s")
model_sa_unet1.save("2022-02-17 SA-UNet1 200epochs.hdf5")
Epoch 1/200
4/4 [==============================] - 10s 479ms/step - loss: 0.8405 - mae_euclidean: 5.2724 - jacard_coef: 0.0874 - val_loss: 0.8714 - val_mae_euclidean: 5.8165 - val_jacard_coef: 0.0687
Epoch 2/200
4/4 [==============================] - 1s 374ms/step - loss: 0.7713 - mae_euclidean: 3.6544 - jacard_coef: 0.1293 - val_loss: 0.8798 - val_mae_euclidean: 5.6584 - val_jacard_coef: 0.0640
Epoch 3/200
4/4 [==============================] - 1s 376ms/step - loss: 0.7341 - mae_euclidean: 3.3476 - jacard_coef: 0.1534 - val_loss: 0.9941 - val_mae_euclidean: 121.6934 - val_jacard_coef: 0.0030
...
[epochs 4-74 omitted: the training loss keeps improving while the validation metrics diverge; val_mae_euclidean overflows to values around 1.8e19, possibly related to the float16 inputs]
...
Epoch 75/200
4/4 [==============================] - 2s 397ms/step - loss: 0.2924 - mae_euclidean: 1.0098 - jacard_coef: 0.5476 - val_loss: 0.9874 - val_mae_euclidean: 32.9031 - val_jacard_coef: 0.0064
...
[epochs 76-169 omitted: the validation metrics gradually recover and start tracking the training metrics]
...
Epoch 170/200
4/4 [==============================] - 2s 398ms/step - loss: 0.2377 - mae_euclidean: 0.8472 - jacard_coef: 0.6159 - val_loss: 0.3244 - val_mae_euclidean: 1.6126 - val_jacard_coef: 0.5101
...
val_jacard_coef: 0.4792 Epoch 172/200 4/4 [==============================] - 2s 400ms/step - loss: 0.2541 - mae_euclidean: 1.0677 - jacard_coef: 0.5948 - val_loss: 0.3288 - val_mae_euclidean: 1.2755 - val_jacard_coef: 0.5052 Epoch 173/200 4/4 [==============================] - 2s 398ms/step - loss: 0.2358 - mae_euclidean: 0.8030 - jacard_coef: 0.6185 - val_loss: 0.3513 - val_mae_euclidean: 1.1357 - val_jacard_coef: 0.4801 Epoch 174/200 4/4 [==============================] - 2s 398ms/step - loss: 0.2349 - mae_euclidean: 0.7714 - jacard_coef: 0.6198 - val_loss: 0.3014 - val_mae_euclidean: 1.1567 - val_jacard_coef: 0.5369 Epoch 175/200 4/4 [==============================] - 2s 399ms/step - loss: 0.2342 - mae_euclidean: 0.8019 - jacard_coef: 0.6206 - val_loss: 0.3204 - val_mae_euclidean: 1.2285 - val_jacard_coef: 0.5147 Epoch 176/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2410 - mae_euclidean: 1.0110 - jacard_coef: 0.6120 - val_loss: 0.3057 - val_mae_euclidean: 1.1025 - val_jacard_coef: 0.5317 Epoch 177/200 4/4 [==============================] - 2s 402ms/step - loss: 0.2415 - mae_euclidean: 0.7881 - jacard_coef: 0.6110 - val_loss: 0.3246 - val_mae_euclidean: 1.2411 - val_jacard_coef: 0.5099 Epoch 178/200 4/4 [==============================] - 2s 397ms/step - loss: 0.2348 - mae_euclidean: 1.0649 - jacard_coef: 0.6198 - val_loss: 0.3039 - val_mae_euclidean: 1.0200 - val_jacard_coef: 0.5338 Epoch 179/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2350 - mae_euclidean: 0.8386 - jacard_coef: 0.6197 - val_loss: 0.2999 - val_mae_euclidean: 1.0664 - val_jacard_coef: 0.5385 Epoch 180/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2362 - mae_euclidean: 0.8652 - jacard_coef: 0.6182 - val_loss: 0.3076 - val_mae_euclidean: 1.2001 - val_jacard_coef: 0.5295 Epoch 181/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2392 - mae_euclidean: 0.8280 - jacard_coef: 0.6141 - val_loss: 0.3293 - 
val_mae_euclidean: 1.7501 - val_jacard_coef: 0.5045 Epoch 182/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2361 - mae_euclidean: 0.8030 - jacard_coef: 0.6183 - val_loss: 0.3101 - val_mae_euclidean: 1.3649 - val_jacard_coef: 0.5266 Epoch 183/200 4/4 [==============================] - 2s 400ms/step - loss: 0.2474 - mae_euclidean: 0.8315 - jacard_coef: 0.6036 - val_loss: 0.3355 - val_mae_euclidean: 1.3453 - val_jacard_coef: 0.4976 Epoch 184/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2388 - mae_euclidean: 0.7858 - jacard_coef: 0.6147 - val_loss: 0.3346 - val_mae_euclidean: 1.8751 - val_jacard_coef: 0.4986 Epoch 185/200 4/4 [==============================] - 2s 398ms/step - loss: 0.2331 - mae_euclidean: 0.7514 - jacard_coef: 0.6220 - val_loss: 0.2983 - val_mae_euclidean: 1.2966 - val_jacard_coef: 0.5405 Epoch 186/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2339 - mae_euclidean: 0.8246 - jacard_coef: 0.6210 - val_loss: 0.3048 - val_mae_euclidean: 1.2117 - val_jacard_coef: 0.5328 Epoch 187/200 4/4 [==============================] - 2s 397ms/step - loss: 0.2347 - mae_euclidean: 0.9667 - jacard_coef: 0.6199 - val_loss: 0.3003 - val_mae_euclidean: 1.0822 - val_jacard_coef: 0.5381 Epoch 188/200 4/4 [==============================] - 2s 399ms/step - loss: 0.2362 - mae_euclidean: 0.8339 - jacard_coef: 0.6180 - val_loss: 0.3050 - val_mae_euclidean: 1.3327 - val_jacard_coef: 0.5325 Epoch 189/200 4/4 [==============================] - 2s 397ms/step - loss: 0.2321 - mae_euclidean: 0.8541 - jacard_coef: 0.6234 - val_loss: 0.3115 - val_mae_euclidean: 1.2849 - val_jacard_coef: 0.5250 Epoch 190/200 4/4 [==============================] - 2s 399ms/step - loss: 0.2238 - mae_euclidean: 0.7100 - jacard_coef: 0.6344 - val_loss: 0.3100 - val_mae_euclidean: 1.0555 - val_jacard_coef: 0.5267 Epoch 191/200 4/4 [==============================] - 2s 403ms/step - loss: 0.2239 - mae_euclidean: 0.7549 - jacard_coef: 0.6341 - 
val_loss: 0.3254 - val_mae_euclidean: 1.2957 - val_jacard_coef: 0.5090 Epoch 192/200 4/4 [==============================] - 2s 399ms/step - loss: 0.2274 - mae_euclidean: 0.7543 - jacard_coef: 0.6296 - val_loss: 0.2995 - val_mae_euclidean: 0.9952 - val_jacard_coef: 0.5391 Epoch 193/200 4/4 [==============================] - 2s 399ms/step - loss: 0.2264 - mae_euclidean: 0.7329 - jacard_coef: 0.6308 - val_loss: 0.3195 - val_mae_euclidean: 1.1394 - val_jacard_coef: 0.5158 Epoch 194/200 4/4 [==============================] - 2s 400ms/step - loss: 0.2304 - mae_euclidean: 0.9172 - jacard_coef: 0.6257 - val_loss: 0.3028 - val_mae_euclidean: 0.8840 - val_jacard_coef: 0.5351 Epoch 195/200 4/4 [==============================] - 2s 403ms/step - loss: 0.2469 - mae_euclidean: 1.2037 - jacard_coef: 0.6045 - val_loss: 0.3098 - val_mae_euclidean: 1.1743 - val_jacard_coef: 0.5269 Epoch 196/200 4/4 [==============================] - 2s 401ms/step - loss: 0.2445 - mae_euclidean: 0.9682 - jacard_coef: 0.6074 - val_loss: 0.3127 - val_mae_euclidean: 1.3041 - val_jacard_coef: 0.5236 Epoch 197/200 4/4 [==============================] - 2s 402ms/step - loss: 0.2391 - mae_euclidean: 1.0908 - jacard_coef: 0.6144 - val_loss: 0.3300 - val_mae_euclidean: 1.4195 - val_jacard_coef: 0.5037 Epoch 198/200 4/4 [==============================] - 2s 402ms/step - loss: 0.2315 - mae_euclidean: 1.0427 - jacard_coef: 0.6242 - val_loss: 0.3029 - val_mae_euclidean: 0.9921 - val_jacard_coef: 0.5351 Epoch 199/200 4/4 [==============================] - 2s 399ms/step - loss: 0.2250 - mae_euclidean: 0.8276 - jacard_coef: 0.6328 - val_loss: 0.2970 - val_mae_euclidean: 1.2448 - val_jacard_coef: 0.5420 Epoch 200/200 4/4 [==============================] - 2s 399ms/step - loss: 0.2276 - mae_euclidean: 0.8272 - jacard_coef: 0.6293 - val_loss: 0.2979 - val_mae_euclidean: 1.2925 - val_jacard_coef: 0.5409 Model trained for 316.5363738536835s
plot_history(history_sa_unet1, "SA Unet")
model_sa_unet1.evaluate(X_val, y_val)
1/1 [==============================] - 0s 353ms/step - loss: 0.2979 - mae_euclidean: 1.2925 - jacard_coef: 0.5409
[0.2979312539100647, 1.292516827583313, 0.540920078754425]
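The `jacard_coef` metric reported above is the intersection-over-union of the predicted and ground-truth border masks. Its definition is in an earlier cell; as a reminder, a minimal NumPy sketch of the usual smoothed form (the smoothing constant of 1.0 is an assumption, not necessarily the value used in the notebook):

```python
import numpy as np

def jacard_coef_np(y_true, y_pred, smooth=1.0):
    """Smoothed Jaccard (IoU) between two masks with values in [0, 1]."""
    y_true = y_true.ravel()
    y_pred = y_pred.ravel()
    intersection = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - intersection
    return (intersection + smooth) / (union + smooth)

# A perfect prediction gives a coefficient of 1.0.
mask = np.array([[0, 1], [1, 0]], dtype=float)
print(jacard_coef_np(mask, mask))  # -> 1.0
```

The smoothing term keeps the ratio defined when both masks are empty and makes the metric differentiable-friendly when used inside a loss.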
# tf.keras.backend.clear_session()
# Attention Unet1
tf.random.set_seed(14)
model_att_unet1 = Attention_UNet(input_shape)
model_att_unet1.compile(optimizer=Adam(learning_rate=1e-2),
                        loss=dice_coef_loss,
                        metrics=[mae_euclidean, jacard_coef]
                        )
start = time.time()
history_att_unet1 = model_att_unet1.fit(X_train, y_train,
                                        validation_data=(X_val, y_val),
                                        batch_size=2,
                                        epochs=50
                                        )
print(f"Model trained for {time.time() - start}s")
model_att_unet1.save("2022-02-17 Att-UNet1 50epochs.hdf5")
Epoch 1/50 16/16 [==============================] - 19s 455ms/step - loss: 0.8100 - mae_euclidean: 4.1651 - jacard_coef: 0.1062 - val_loss: 0.9816 - val_mae_euclidean: 63.0797 - val_jacard_coef: 0.0093
[epochs 2-48 omitted; training loss fell steadily while val_loss only started to improve around epoch 20, with occasional extremely large val_mae_euclidean values in the early epochs]
Epoch 49/50 16/16 [==============================] - 7s 420ms/step - loss: 0.2267 - mae_euclidean: 0.6230 - jacard_coef: 0.6309 - val_loss: 0.3851 - val_mae_euclidean: 1.3435 - val_jacard_coef: 0.4451
Epoch 50/50 16/16 [==============================] - 7s 420ms/step - loss: 0.2231 - mae_euclidean: 0.6283 - jacard_coef: 0.6359 - val_loss: 0.3376 - val_mae_euclidean: 1.2765 - val_jacard_coef: 0.4964
Model trained for 347.18052101135254s
plot_history(history_att_unet1, "Attention Unet")
# additional training with smaller learning rate
model_att_unet1.compile(optimizer=Adam(learning_rate=1e-3),
                        loss=dice_coef_loss,
                        metrics=[mae_euclidean, jacard_coef]
                        )
start = time.time()
history_att_unet1a = model_att_unet1.fit(X_train, y_train,
                                         validation_data=(X_val, y_val),
                                         batch_size=2,
                                         epochs=25
                                         )
print(f"Model trained for {time.time() - start}s")
model_att_unet1.save("2022-02-17 Att-UNet1a 50+25epochs.hdf5")
Epoch 1/25 16/16 [==============================] - 11s 453ms/step - loss: 0.2119 - mae_euclidean: 0.5784 - jacard_coef: 0.6516 - val_loss: 0.3437 - val_mae_euclidean: 1.2986 - val_jacard_coef: 0.4897
[epochs 2-23 omitted; training loss kept falling to about 0.14 while val_loss plateaued around 0.32-0.33]
Epoch 24/25 16/16 [==============================] - 7s 409ms/step - loss: 0.1392 - mae_euclidean: 0.3698 - jacard_coef: 0.7561 - val_loss: 0.3241 - val_mae_euclidean: 1.1641 - val_jacard_coef: 0.5126
Epoch 25/25 16/16 [==============================] - 6s 407ms/step - loss: 0.1367 - mae_euclidean: 0.3588 - jacard_coef: 0.7601 - val_loss: 0.3246 - val_mae_euclidean: 1.1399 - val_jacard_coef: 0.5120
Model trained for 171.23888444900513s
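Re-compiling with a smaller learning rate and calling `fit` again, as done above, is a manual two-stage schedule (the compiled optimizer is replaced, but the trained weights are kept). The same effect could be achieved in a single `fit` call with a step-decay schedule; a hypothetical sketch, where the decay factor and step size are illustrative, not values used in this notebook:

```python
def step_decay(epoch, initial_lr=1e-2, drop=0.1, epochs_per_drop=50):
    """Divide the learning rate by 10 every `epochs_per_drop` epochs."""
    return initial_lr * (drop ** (epoch // epochs_per_drop))

# First 50 epochs at 1e-2, then 1e-3 -- mirroring the manual restart above.
for epoch in (0, 49, 50, 75):
    print(epoch, step_decay(epoch))
```

Such a function could be wired into training via `tf.keras.callbacks.LearningRateScheduler(step_decay)` passed to `fit`.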
plot_history(history_att_unet1a, "Att Unet additional training")
model_att_unet1.evaluate(X_val, y_val)
1/1 [==============================] - 7s 7s/step - loss: 0.3036 - mae_euclidean: 1.1399 - jacard_coef: 0.5342
[0.3036247491836548, 1.139896035194397, 0.5341904759407043]
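Note that the Dice loss being optimized and the Jaccard coefficient being reported are monotonically related, J = D / (2 - D), so improving one improves the other. A quick numerical check against the evaluation above (the correspondence is approximate because of the smoothing terms in the actual metric implementations):

```python
def jaccard_from_dice(dice):
    """Convert a Dice coefficient to the equivalent Jaccard (IoU) value."""
    return dice / (2.0 - dice)

# val_loss 0.3036 corresponds to a Dice of ~0.6964, i.e. a Jaccard of
# about 0.534 -- consistent with the reported val_jacard_coef of 0.5342.
print(round(jaccard_from_dice(1 - 0.3036), 4))  # -> 0.5342
```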
# tf.keras.backend.clear_session()
# Classic Unet1
tf.random.set_seed(14)
model_unet1 = UNet(input_shape)
model_unet1.compile(optimizer=Adam(learning_rate=1e-2),
                    loss=dice_coef_loss,
                    metrics=[mae_euclidean, jacard_coef]
                    )
start = time.time()
history_unet1 = model_unet1.fit(X_train, y_train,
                                validation_data=(X_val, y_val),
                                batch_size=4,
                                epochs=200
                                )
print(f"Model trained for {time.time() - start}s")
model_unet1.save("2022-02-17 UNet1 200epochs.hdf5")
Epoch 1/200 8/8 [==============================] - 13s 689ms/step - loss: 0.8266 - mae_euclidean: 4.5184 - jacard_coef: 0.0958 - val_loss: 0.8604 - val_mae_euclidean: 6.2757 - val_jacard_coef: 0.0761
[epochs 2-47 omitted; training loss fell steadily while val_loss stayed near 0.9, with repeated extremely large val_mae_euclidean values, and only began to drop after roughly epoch 40]
Epoch 48/200 8/8 [==============================] - 5s 610ms/step - loss: 0.3025 - mae_euclidean: 0.9358 - jacard_coef: 0.5362 - val_loss: 0.6993 - val_mae_euclidean: 5.3684 - val_jacard_coef: 0.1779
Epoch 49/200 8/8 [==============================] - 5s 611ms/step - loss: 0.2893 - mae_euclidean: 0.8107
- jacard_coef: 0.5517 - val_loss: 0.7214 - val_mae_euclidean: 4.9237 - val_jacard_coef: 0.1625 Epoch 50/200 8/8 [==============================] - 5s 614ms/step - loss: 0.2921 - mae_euclidean: 0.8298 - jacard_coef: 0.5485 - val_loss: 0.7943 - val_mae_euclidean: 6.2637 - val_jacard_coef: 0.1150 Epoch 51/200 8/8 [==============================] - 5s 611ms/step - loss: 0.3021 - mae_euclidean: 0.9762 - jacard_coef: 0.5364 - val_loss: 0.6669 - val_mae_euclidean: 4.9943 - val_jacard_coef: 0.2002 Epoch 52/200 8/8 [==============================] - 5s 612ms/step - loss: 0.2966 - mae_euclidean: 0.8238 - jacard_coef: 0.5432 - val_loss: 0.6788 - val_mae_euclidean: 4.6027 - val_jacard_coef: 0.1926 Epoch 53/200 8/8 [==============================] - 5s 612ms/step - loss: 0.2825 - mae_euclidean: 0.7762 - jacard_coef: 0.5598 - val_loss: 0.6395 - val_mae_euclidean: 3.8339 - val_jacard_coef: 0.2218 Epoch 54/200 8/8 [==============================] - 5s 612ms/step - loss: 0.2741 - mae_euclidean: 0.7314 - jacard_coef: 0.5700 - val_loss: 0.5702 - val_mae_euclidean: 3.8825 - val_jacard_coef: 0.2748 Epoch 55/200 8/8 [==============================] - 5s 614ms/step - loss: 0.2766 - mae_euclidean: 0.8394 - jacard_coef: 0.5685 - val_loss: 0.6874 - val_mae_euclidean: 5.1167 - val_jacard_coef: 0.1859 Epoch 56/200 8/8 [==============================] - 5s 610ms/step - loss: 0.2881 - mae_euclidean: 0.8652 - jacard_coef: 0.5556 - val_loss: 0.5607 - val_mae_euclidean: 3.5050 - val_jacard_coef: 0.2833 Epoch 57/200 8/8 [==============================] - 5s 610ms/step - loss: 0.2602 - mae_euclidean: 0.8202 - jacard_coef: 0.5875 - val_loss: 0.5213 - val_mae_euclidean: 3.2169 - val_jacard_coef: 0.3185 Epoch 58/200 8/8 [==============================] - 5s 610ms/step - loss: 0.2549 - mae_euclidean: 0.8654 - jacard_coef: 0.5941 - val_loss: 0.5373 - val_mae_euclidean: 3.2261 - val_jacard_coef: 0.3044 Epoch 59/200 8/8 [==============================] - 5s 608ms/step - loss: 0.2489 - mae_euclidean: 0.8006 
- jacard_coef: 0.6017 - val_loss: 0.6149 - val_mae_euclidean: 3.8476 - val_jacard_coef: 0.2397 Epoch 60/200 8/8 [==============================] - 5s 610ms/step - loss: 0.2652 - mae_euclidean: 1.0125 - jacard_coef: 0.5819 - val_loss: 0.4978 - val_mae_euclidean: 3.0929 - val_jacard_coef: 0.3366 Epoch 61/200 8/8 [==============================] - 5s 610ms/step - loss: 0.2516 - mae_euclidean: 0.7029 - jacard_coef: 0.5981 - val_loss: 0.5004 - val_mae_euclidean: 3.1981 - val_jacard_coef: 0.3374 Epoch 62/200 8/8 [==============================] - 5s 608ms/step - loss: 0.2396 - mae_euclidean: 0.7570 - jacard_coef: 0.6141 - val_loss: 0.5128 - val_mae_euclidean: 3.3195 - val_jacard_coef: 0.3231 Epoch 63/200 8/8 [==============================] - 5s 609ms/step - loss: 0.2755 - mae_euclidean: 0.7792 - jacard_coef: 0.5692 - val_loss: 0.4893 - val_mae_euclidean: 3.1399 - val_jacard_coef: 0.3451 Epoch 64/200 8/8 [==============================] - 5s 612ms/step - loss: 0.2848 - mae_euclidean: 0.9411 - jacard_coef: 0.5589 - val_loss: 0.5018 - val_mae_euclidean: 3.1243 - val_jacard_coef: 0.3337 Epoch 65/200 8/8 [==============================] - 5s 612ms/step - loss: 0.2541 - mae_euclidean: 0.9402 - jacard_coef: 0.5969 - val_loss: 0.4383 - val_mae_euclidean: 2.8092 - val_jacard_coef: 0.3927 Epoch 66/200 8/8 [==============================] - 5s 612ms/step - loss: 0.2667 - mae_euclidean: 0.8846 - jacard_coef: 0.5806 - val_loss: 0.4016 - val_mae_euclidean: 2.0892 - val_jacard_coef: 0.4298 Epoch 67/200 8/8 [==============================] - 5s 611ms/step - loss: 0.2512 - mae_euclidean: 0.7424 - jacard_coef: 0.5990 - val_loss: 0.4008 - val_mae_euclidean: 2.1738 - val_jacard_coef: 0.4296 Epoch 68/200 8/8 [==============================] - 5s 612ms/step - loss: 0.2395 - mae_euclidean: 0.7713 - jacard_coef: 0.6144 - val_loss: 0.3528 - val_mae_euclidean: 1.4928 - val_jacard_coef: 0.4808 Epoch 69/200 8/8 [==============================] - 5s 611ms/step - loss: 0.2369 - mae_euclidean: 0.7496 
- jacard_coef: 0.6177 - val_loss: 0.3529 - val_mae_euclidean: 1.3945 - val_jacard_coef: 0.4803 Epoch 70/200 8/8 [==============================] - 5s 609ms/step - loss: 0.2213 - mae_euclidean: 0.7163 - jacard_coef: 0.6380 - val_loss: 0.4293 - val_mae_euclidean: 2.6484 - val_jacard_coef: 0.4009 Epoch 71/200 8/8 [==============================] - 5s 607ms/step - loss: 0.2433 - mae_euclidean: 0.8074 - jacard_coef: 0.6104 - val_loss: 0.3815 - val_mae_euclidean: 2.1519 - val_jacard_coef: 0.4492 Epoch 72/200 8/8 [==============================] - 5s 609ms/step - loss: 0.2133 - mae_euclidean: 0.6500 - jacard_coef: 0.6487 - val_loss: 0.3675 - val_mae_euclidean: 1.6960 - val_jacard_coef: 0.4638 Epoch 73/200 8/8 [==============================] - 5s 609ms/step - loss: 0.2170 - mae_euclidean: 0.6736 - jacard_coef: 0.6439 - val_loss: 0.3878 - val_mae_euclidean: 1.3345 - val_jacard_coef: 0.4446 Epoch 74/200 8/8 [==============================] - 5s 614ms/step - loss: 0.2298 - mae_euclidean: 0.6974 - jacard_coef: 0.6278 - val_loss: 0.3614 - val_mae_euclidean: 1.5674 - val_jacard_coef: 0.4700 Epoch 75/200 8/8 [==============================] - 5s 614ms/step - loss: 0.2180 - mae_euclidean: 0.6765 - jacard_coef: 0.6429 - val_loss: 0.3565 - val_mae_euclidean: 1.3579 - val_jacard_coef: 0.4754 Epoch 76/200 8/8 [==============================] - 5s 611ms/step - loss: 0.2275 - mae_euclidean: 0.6522 - jacard_coef: 0.6305 - val_loss: 0.3503 - val_mae_euclidean: 1.1337 - val_jacard_coef: 0.4840 Epoch 77/200 8/8 [==============================] - 5s 614ms/step - loss: 0.2257 - mae_euclidean: 0.8305 - jacard_coef: 0.6336 - val_loss: 0.3945 - val_mae_euclidean: 1.4522 - val_jacard_coef: 0.4351 Epoch 78/200 8/8 [==============================] - 5s 614ms/step - loss: 0.2034 - mae_euclidean: 0.5747 - jacard_coef: 0.6624 - val_loss: 0.3643 - val_mae_euclidean: 1.6962 - val_jacard_coef: 0.4667 Epoch 79/200 8/8 [==============================] - 5s 610ms/step - loss: 0.1871 - mae_euclidean: 0.5508 
- jacard_coef: 0.6855 - val_loss: 0.3742 - val_mae_euclidean: 1.9555 - val_jacard_coef: 0.4568 Epoch 80/200 8/8 [==============================] - 5s 611ms/step - loss: 0.1838 - mae_euclidean: 0.5627 - jacard_coef: 0.6904 - val_loss: 0.3814 - val_mae_euclidean: 1.9120 - val_jacard_coef: 0.4488 Epoch 81/200 8/8 [==============================] - 5s 607ms/step - loss: 0.1805 - mae_euclidean: 0.5079 - jacard_coef: 0.6950 - val_loss: 0.3629 - val_mae_euclidean: 1.6425 - val_jacard_coef: 0.4687 Epoch 82/200 8/8 [==============================] - 5s 609ms/step - loss: 0.1739 - mae_euclidean: 0.5638 - jacard_coef: 0.7045 - val_loss: 0.3749 - val_mae_euclidean: 1.5853 - val_jacard_coef: 0.4554 Epoch 83/200 8/8 [==============================] - 5s 607ms/step - loss: 0.1579 - mae_euclidean: 0.4554 - jacard_coef: 0.7274 - val_loss: 0.3806 - val_mae_euclidean: 1.5563 - val_jacard_coef: 0.4495 Epoch 84/200 8/8 [==============================] - 5s 608ms/step - loss: 0.1583 - mae_euclidean: 0.4432 - jacard_coef: 0.7271 - val_loss: 0.3628 - val_mae_euclidean: 1.3823 - val_jacard_coef: 0.4686 Epoch 85/200 8/8 [==============================] - 5s 610ms/step - loss: 0.1730 - mae_euclidean: 0.5104 - jacard_coef: 0.7056 - val_loss: 0.3454 - val_mae_euclidean: 1.1774 - val_jacard_coef: 0.4876 Epoch 86/200 8/8 [==============================] - 5s 607ms/step - loss: 0.1663 - mae_euclidean: 0.5925 - jacard_coef: 0.7153 - val_loss: 0.3511 - val_mae_euclidean: 1.3100 - val_jacard_coef: 0.4815 Epoch 87/200 8/8 [==============================] - 5s 607ms/step - loss: 0.1559 - mae_euclidean: 0.4611 - jacard_coef: 0.7313 - val_loss: 0.3890 - val_mae_euclidean: 1.4781 - val_jacard_coef: 0.4405 Epoch 88/200 8/8 [==============================] - 5s 606ms/step - loss: 0.1524 - mae_euclidean: 0.4709 - jacard_coef: 0.7356 - val_loss: 0.3519 - val_mae_euclidean: 1.2116 - val_jacard_coef: 0.4803 Epoch 89/200 8/8 [==============================] - 5s 608ms/step - loss: 0.1446 - mae_euclidean: 0.4077 
- jacard_coef: 0.7477 - val_loss: 0.3719 - val_mae_euclidean: 1.2638 - val_jacard_coef: 0.4594 Epoch 90/200 8/8 [==============================] - 5s 607ms/step - loss: 0.1389 - mae_euclidean: 0.3582 - jacard_coef: 0.7563 - val_loss: 0.3564 - val_mae_euclidean: 1.1883 - val_jacard_coef: 0.4752 Epoch 91/200 8/8 [==============================] - 5s 608ms/step - loss: 0.1249 - mae_euclidean: 0.3778 - jacard_coef: 0.7783 - val_loss: 0.3552 - val_mae_euclidean: 1.2749 - val_jacard_coef: 0.4772 Epoch 92/200 8/8 [==============================] - 5s 606ms/step - loss: 0.1335 - mae_euclidean: 0.4555 - jacard_coef: 0.7650 - val_loss: 0.3507 - val_mae_euclidean: 1.2442 - val_jacard_coef: 0.4819 Epoch 93/200 8/8 [==============================] - 5s 609ms/step - loss: 0.1497 - mae_euclidean: 0.4695 - jacard_coef: 0.7410 - val_loss: 0.3444 - val_mae_euclidean: 1.1984 - val_jacard_coef: 0.4887 Epoch 94/200 8/8 [==============================] - 5s 606ms/step - loss: 0.1288 - mae_euclidean: 0.4333 - jacard_coef: 0.7721 - val_loss: 0.3549 - val_mae_euclidean: 1.1833 - val_jacard_coef: 0.4775 Epoch 95/200 8/8 [==============================] - 5s 607ms/step - loss: 0.1353 - mae_euclidean: 0.3887 - jacard_coef: 0.7622 - val_loss: 0.3653 - val_mae_euclidean: 1.3282 - val_jacard_coef: 0.4654 Epoch 96/200 8/8 [==============================] - 5s 611ms/step - loss: 0.1318 - mae_euclidean: 0.4792 - jacard_coef: 0.7676 - val_loss: 0.3388 - val_mae_euclidean: 1.1051 - val_jacard_coef: 0.4949 Epoch 97/200 8/8 [==============================] - 5s 611ms/step - loss: 0.1168 - mae_euclidean: 0.3406 - jacard_coef: 0.7910 - val_loss: 0.3398 - val_mae_euclidean: 1.2240 - val_jacard_coef: 0.4939 Epoch 98/200 8/8 [==============================] - 5s 611ms/step - loss: 0.1104 - mae_euclidean: 0.3391 - jacard_coef: 0.8014 - val_loss: 0.3479 - val_mae_euclidean: 1.1362 - val_jacard_coef: 0.4848 Epoch 99/200 8/8 [==============================] - 5s 613ms/step - loss: 0.1123 - mae_euclidean: 0.2849 
- jacard_coef: 0.7984 - val_loss: 0.3441 - val_mae_euclidean: 1.0627 - val_jacard_coef: 0.4894 Epoch 100/200 8/8 [==============================] - 5s 617ms/step - loss: 0.1022 - mae_euclidean: 0.3117 - jacard_coef: 0.8148 - val_loss: 0.3405 - val_mae_euclidean: 1.1958 - val_jacard_coef: 0.4936 Epoch 101/200 8/8 [==============================] - 5s 616ms/step - loss: 0.0954 - mae_euclidean: 0.3330 - jacard_coef: 0.8261 - val_loss: 0.3541 - val_mae_euclidean: 0.9266 - val_jacard_coef: 0.4788 Epoch 102/200 8/8 [==============================] - 5s 617ms/step - loss: 0.0918 - mae_euclidean: 0.3050 - jacard_coef: 0.8320 - val_loss: 0.3516 - val_mae_euclidean: 1.1770 - val_jacard_coef: 0.4806 Epoch 103/200 8/8 [==============================] - 5s 615ms/step - loss: 0.0981 - mae_euclidean: 0.2992 - jacard_coef: 0.8220 - val_loss: 0.3535 - val_mae_euclidean: 1.1093 - val_jacard_coef: 0.4791 Epoch 104/200 8/8 [==============================] - 5s 615ms/step - loss: 0.1126 - mae_euclidean: 0.2683 - jacard_coef: 0.7987 - val_loss: 0.3457 - val_mae_euclidean: 1.1714 - val_jacard_coef: 0.4875 Epoch 105/200 8/8 [==============================] - 5s 613ms/step - loss: 0.1004 - mae_euclidean: 0.3487 - jacard_coef: 0.8185 - val_loss: 0.3579 - val_mae_euclidean: 1.1319 - val_jacard_coef: 0.4737 Epoch 106/200 8/8 [==============================] - 5s 615ms/step - loss: 0.1228 - mae_euclidean: 0.5543 - jacard_coef: 0.7822 - val_loss: 0.3720 - val_mae_euclidean: 1.3449 - val_jacard_coef: 0.4583 Epoch 107/200 8/8 [==============================] - 5s 613ms/step - loss: 0.1117 - mae_euclidean: 0.3522 - jacard_coef: 0.7993 - val_loss: 0.3556 - val_mae_euclidean: 1.1752 - val_jacard_coef: 0.4771 Epoch 108/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0983 - mae_euclidean: 0.2943 - jacard_coef: 0.8212 - val_loss: 0.3546 - val_mae_euclidean: 1.0912 - val_jacard_coef: 0.4782 Epoch 109/200 8/8 [==============================] - 5s 611ms/step - loss: 0.1011 - 
mae_euclidean: 0.3997 - jacard_coef: 0.8169 - val_loss: 0.3642 - val_mae_euclidean: 1.0987 - val_jacard_coef: 0.4670 Epoch 110/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0946 - mae_euclidean: 0.2842 - jacard_coef: 0.8277 - val_loss: 0.3403 - val_mae_euclidean: 1.1789 - val_jacard_coef: 0.4937 Epoch 111/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0798 - mae_euclidean: 0.2790 - jacard_coef: 0.8525 - val_loss: 0.3416 - val_mae_euclidean: 1.0845 - val_jacard_coef: 0.4924 Epoch 112/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0806 - mae_euclidean: 0.2435 - jacard_coef: 0.8516 - val_loss: 0.3555 - val_mae_euclidean: 1.2185 - val_jacard_coef: 0.4764 Epoch 113/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0787 - mae_euclidean: 0.2251 - jacard_coef: 0.8542 - val_loss: 0.3521 - val_mae_euclidean: 1.0745 - val_jacard_coef: 0.4799 Epoch 114/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0807 - mae_euclidean: 0.2702 - jacard_coef: 0.8509 - val_loss: 0.3586 - val_mae_euclidean: 1.2088 - val_jacard_coef: 0.4727 Epoch 115/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0701 - mae_euclidean: 0.2311 - jacard_coef: 0.8692 - val_loss: 0.3532 - val_mae_euclidean: 1.0219 - val_jacard_coef: 0.4789 Epoch 116/200 8/8 [==============================] - 5s 608ms/step - loss: 0.0663 - mae_euclidean: 0.2212 - jacard_coef: 0.8759 - val_loss: 0.3498 - val_mae_euclidean: 1.1275 - val_jacard_coef: 0.4824 Epoch 117/200 8/8 [==============================] - 5s 608ms/step - loss: 0.0687 - mae_euclidean: 0.2214 - jacard_coef: 0.8716 - val_loss: 0.3527 - val_mae_euclidean: 0.9479 - val_jacard_coef: 0.4797 Epoch 118/200 8/8 [==============================] - 5s 609ms/step - loss: 0.0628 - mae_euclidean: 0.2101 - jacard_coef: 0.8820 - val_loss: 0.3505 - val_mae_euclidean: 0.9437 - val_jacard_coef: 0.4818 Epoch 119/200 8/8 [==============================] - 5s 611ms/step - 
loss: 0.0647 - mae_euclidean: 0.1846 - jacard_coef: 0.8788 - val_loss: 0.3517 - val_mae_euclidean: 1.0621 - val_jacard_coef: 0.4811 Epoch 120/200 8/8 [==============================] - 5s 609ms/step - loss: 0.0582 - mae_euclidean: 0.2892 - jacard_coef: 0.8901 - val_loss: 0.3434 - val_mae_euclidean: 1.0731 - val_jacard_coef: 0.4902 Epoch 121/200 8/8 [==============================] - 5s 609ms/step - loss: 0.0607 - mae_euclidean: 0.2088 - jacard_coef: 0.8859 - val_loss: 0.3579 - val_mae_euclidean: 0.9268 - val_jacard_coef: 0.4742 Epoch 122/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0521 - mae_euclidean: 0.1639 - jacard_coef: 0.9011 - val_loss: 0.3495 - val_mae_euclidean: 0.9374 - val_jacard_coef: 0.4832 Epoch 123/200 8/8 [==============================] - 5s 609ms/step - loss: 0.0542 - mae_euclidean: 0.1920 - jacard_coef: 0.8977 - val_loss: 0.3573 - val_mae_euclidean: 1.0320 - val_jacard_coef: 0.4749 Epoch 124/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0504 - mae_euclidean: 0.2032 - jacard_coef: 0.9042 - val_loss: 0.3450 - val_mae_euclidean: 1.0372 - val_jacard_coef: 0.4884 Epoch 125/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0505 - mae_euclidean: 0.1748 - jacard_coef: 0.9039 - val_loss: 0.3523 - val_mae_euclidean: 1.1676 - val_jacard_coef: 0.4806 Epoch 126/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0465 - mae_euclidean: 0.1763 - jacard_coef: 0.9113 - val_loss: 0.3408 - val_mae_euclidean: 0.8436 - val_jacard_coef: 0.4929 Epoch 127/200 8/8 [==============================] - 5s 614ms/step - loss: 0.0455 - mae_euclidean: 0.1743 - jacard_coef: 0.9130 - val_loss: 0.3517 - val_mae_euclidean: 1.2290 - val_jacard_coef: 0.4812 Epoch 128/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0432 - mae_euclidean: 0.1383 - jacard_coef: 0.9172 - val_loss: 0.3472 - val_mae_euclidean: 1.0750 - val_jacard_coef: 0.4861 Epoch 129/200 8/8 [==============================] - 5s 
612ms/step - loss: 0.0419 - mae_euclidean: 0.1315 - jacard_coef: 0.9198 - val_loss: 0.3579 - val_mae_euclidean: 1.0498 - val_jacard_coef: 0.4742 Epoch 130/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0419 - mae_euclidean: 0.1490 - jacard_coef: 0.9196 - val_loss: 0.3600 - val_mae_euclidean: 1.2206 - val_jacard_coef: 0.4717 Epoch 131/200 8/8 [==============================] - 5s 609ms/step - loss: 0.0420 - mae_euclidean: 0.1740 - jacard_coef: 0.9195 - val_loss: 0.3464 - val_mae_euclidean: 0.9100 - val_jacard_coef: 0.4870 Epoch 132/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0409 - mae_euclidean: 0.1316 - jacard_coef: 0.9215 - val_loss: 0.3485 - val_mae_euclidean: 1.0660 - val_jacard_coef: 0.4848 Epoch 133/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0393 - mae_euclidean: 0.1261 - jacard_coef: 0.9245 - val_loss: 0.3496 - val_mae_euclidean: 1.0285 - val_jacard_coef: 0.4833 Epoch 134/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0403 - mae_euclidean: 0.1273 - jacard_coef: 0.9227 - val_loss: 0.3541 - val_mae_euclidean: 1.0455 - val_jacard_coef: 0.4790 Epoch 135/200 8/8 [==============================] - 5s 615ms/step - loss: 0.0388 - mae_euclidean: 0.1275 - jacard_coef: 0.9253 - val_loss: 0.3497 - val_mae_euclidean: 1.0033 - val_jacard_coef: 0.4834 Epoch 136/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0368 - mae_euclidean: 0.1586 - jacard_coef: 0.9290 - val_loss: 0.3454 - val_mae_euclidean: 0.9854 - val_jacard_coef: 0.4879 Epoch 137/200 8/8 [==============================] - 5s 617ms/step - loss: 0.0360 - mae_euclidean: 0.1179 - jacard_coef: 0.9307 - val_loss: 0.3541 - val_mae_euclidean: 0.9457 - val_jacard_coef: 0.4785 Epoch 138/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0367 - mae_euclidean: 0.1178 - jacard_coef: 0.9292 - val_loss: 0.3480 - val_mae_euclidean: 0.9147 - val_jacard_coef: 0.4852 Epoch 139/200 8/8 
[==============================] - 5s 616ms/step - loss: 0.0332 - mae_euclidean: 0.1262 - jacard_coef: 0.9358 - val_loss: 0.3487 - val_mae_euclidean: 0.9969 - val_jacard_coef: 0.4843 Epoch 140/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0314 - mae_euclidean: 0.1159 - jacard_coef: 0.9393 - val_loss: 0.3451 - val_mae_euclidean: 1.0745 - val_jacard_coef: 0.4884 Epoch 141/200 8/8 [==============================] - 5s 609ms/step - loss: 0.0322 - mae_euclidean: 0.1261 - jacard_coef: 0.9378 - val_loss: 0.3496 - val_mae_euclidean: 1.0258 - val_jacard_coef: 0.4835 Epoch 142/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0312 - mae_euclidean: 0.1128 - jacard_coef: 0.9396 - val_loss: 0.3457 - val_mae_euclidean: 1.0330 - val_jacard_coef: 0.4879 Epoch 143/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0298 - mae_euclidean: 0.1101 - jacard_coef: 0.9421 - val_loss: 0.3529 - val_mae_euclidean: 1.0199 - val_jacard_coef: 0.4798 Epoch 144/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0291 - mae_euclidean: 0.1123 - jacard_coef: 0.9435 - val_loss: 0.3482 - val_mae_euclidean: 1.0565 - val_jacard_coef: 0.4850 Epoch 145/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0288 - mae_euclidean: 0.1094 - jacard_coef: 0.9441 - val_loss: 0.3517 - val_mae_euclidean: 0.9107 - val_jacard_coef: 0.4812 Epoch 146/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0287 - mae_euclidean: 0.1089 - jacard_coef: 0.9443 - val_loss: 0.3469 - val_mae_euclidean: 1.0526 - val_jacard_coef: 0.4866 Epoch 147/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0281 - mae_euclidean: 0.1060 - jacard_coef: 0.9454 - val_loss: 0.3513 - val_mae_euclidean: 0.9456 - val_jacard_coef: 0.4815 Epoch 148/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0295 - mae_euclidean: 0.1263 - jacard_coef: 0.9428 - val_loss: 0.3457 - val_mae_euclidean: 0.9993 - val_jacard_coef: 0.4876 Epoch 
149/200 8/8 [==============================] - 5s 616ms/step - loss: 0.0284 - mae_euclidean: 0.1150 - jacard_coef: 0.9448 - val_loss: 0.3484 - val_mae_euclidean: 0.9251 - val_jacard_coef: 0.4849 Epoch 150/200 8/8 [==============================] - 5s 616ms/step - loss: 0.0299 - mae_euclidean: 0.1059 - jacard_coef: 0.9421 - val_loss: 0.3494 - val_mae_euclidean: 0.8727 - val_jacard_coef: 0.4834 Epoch 151/200 8/8 [==============================] - 5s 616ms/step - loss: 0.0273 - mae_euclidean: 0.1053 - jacard_coef: 0.9470 - val_loss: 0.3454 - val_mae_euclidean: 0.8657 - val_jacard_coef: 0.4882 Epoch 152/200 8/8 [==============================] - 5s 617ms/step - loss: 0.0282 - mae_euclidean: 0.1054 - jacard_coef: 0.9452 - val_loss: 0.3497 - val_mae_euclidean: 0.9167 - val_jacard_coef: 0.4832 Epoch 153/200 8/8 [==============================] - 5s 614ms/step - loss: 0.0265 - mae_euclidean: 0.1047 - jacard_coef: 0.9485 - val_loss: 0.3491 - val_mae_euclidean: 0.9298 - val_jacard_coef: 0.4838 Epoch 154/200 8/8 [==============================] - 5s 614ms/step - loss: 0.0269 - mae_euclidean: 0.1033 - jacard_coef: 0.9476 - val_loss: 0.3488 - val_mae_euclidean: 0.9507 - val_jacard_coef: 0.4844 Epoch 155/200 8/8 [==============================] - 5s 616ms/step - loss: 0.0278 - mae_euclidean: 0.1045 - jacard_coef: 0.9460 - val_loss: 0.3483 - val_mae_euclidean: 0.9504 - val_jacard_coef: 0.4848 Epoch 156/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0273 - mae_euclidean: 0.1037 - jacard_coef: 0.9469 - val_loss: 0.3491 - val_mae_euclidean: 0.9390 - val_jacard_coef: 0.4841 Epoch 157/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0265 - mae_euclidean: 0.1031 - jacard_coef: 0.9484 - val_loss: 0.3467 - val_mae_euclidean: 1.0608 - val_jacard_coef: 0.4864 Epoch 158/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0280 - mae_euclidean: 0.1048 - jacard_coef: 0.9456 - val_loss: 0.3515 - val_mae_euclidean: 0.9522 - val_jacard_coef: 
0.4814 Epoch 159/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0272 - mae_euclidean: 0.1049 - jacard_coef: 0.9471 - val_loss: 0.3456 - val_mae_euclidean: 0.9382 - val_jacard_coef: 0.4877 Epoch 160/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0256 - mae_euclidean: 0.1023 - jacard_coef: 0.9502 - val_loss: 0.3469 - val_mae_euclidean: 0.8817 - val_jacard_coef: 0.4866 Epoch 161/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0256 - mae_euclidean: 0.1017 - jacard_coef: 0.9502 - val_loss: 0.3457 - val_mae_euclidean: 0.8592 - val_jacard_coef: 0.4879 Epoch 162/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0252 - mae_euclidean: 0.1016 - jacard_coef: 0.9508 - val_loss: 0.3463 - val_mae_euclidean: 0.9182 - val_jacard_coef: 0.4870 Epoch 163/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0252 - mae_euclidean: 0.1007 - jacard_coef: 0.9508 - val_loss: 0.3467 - val_mae_euclidean: 0.9461 - val_jacard_coef: 0.4866 Epoch 164/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0251 - mae_euclidean: 0.1016 - jacard_coef: 0.9511 - val_loss: 0.3499 - val_mae_euclidean: 0.8847 - val_jacard_coef: 0.4834 Epoch 165/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0249 - mae_euclidean: 0.1012 - jacard_coef: 0.9513 - val_loss: 0.3468 - val_mae_euclidean: 0.9378 - val_jacard_coef: 0.4871 Epoch 166/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0259 - mae_euclidean: 0.1015 - jacard_coef: 0.9496 - val_loss: 0.3452 - val_mae_euclidean: 0.8768 - val_jacard_coef: 0.4884 Epoch 167/200 8/8 [==============================] - 5s 609ms/step - loss: 0.0249 - mae_euclidean: 0.1012 - jacard_coef: 0.9514 - val_loss: 0.3467 - val_mae_euclidean: 0.9784 - val_jacard_coef: 0.4867 Epoch 168/200 8/8 [==============================] - 5s 608ms/step - loss: 0.0255 - mae_euclidean: 0.1001 - jacard_coef: 0.9505 - val_loss: 0.3452 - val_mae_euclidean: 0.8669 - 
val_jacard_coef: 0.4884 Epoch 169/200 8/8 [==============================] - 5s 608ms/step - loss: 0.0245 - mae_euclidean: 0.0995 - jacard_coef: 0.9522 - val_loss: 0.3438 - val_mae_euclidean: 0.8604 - val_jacard_coef: 0.4900 Epoch 170/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0242 - mae_euclidean: 0.1000 - jacard_coef: 0.9528 - val_loss: 0.3457 - val_mae_euclidean: 0.8262 - val_jacard_coef: 0.4878 Epoch 171/200 8/8 [==============================] - 5s 611ms/step - loss: 0.0256 - mae_euclidean: 0.1043 - jacard_coef: 0.9502 - val_loss: 0.3455 - val_mae_euclidean: 0.8391 - val_jacard_coef: 0.4878 Epoch 172/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0264 - mae_euclidean: 0.1001 - jacard_coef: 0.9486 - val_loss: 0.3494 - val_mae_euclidean: 0.9405 - val_jacard_coef: 0.4840 Epoch 173/200 8/8 [==============================] - 5s 610ms/step - loss: 0.0246 - mae_euclidean: 0.1021 - jacard_coef: 0.9521 - val_loss: 0.3484 - val_mae_euclidean: 0.8648 - val_jacard_coef: 0.4850 Epoch 174/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0247 - mae_euclidean: 0.1018 - jacard_coef: 0.9519 - val_loss: 0.3463 - val_mae_euclidean: 0.8529 - val_jacard_coef: 0.4872 Epoch 175/200 8/8 [==============================] - 5s 614ms/step - loss: 0.0251 - mae_euclidean: 0.0997 - jacard_coef: 0.9511 - val_loss: 0.3448 - val_mae_euclidean: 1.0493 - val_jacard_coef: 0.4887 Epoch 176/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0261 - mae_euclidean: 0.0997 - jacard_coef: 0.9492 - val_loss: 0.3434 - val_mae_euclidean: 0.8713 - val_jacard_coef: 0.4900 Epoch 177/200 8/8 [==============================] - 5s 613ms/step - loss: 0.0243 - mae_euclidean: 0.0984 - jacard_coef: 0.9525 - val_loss: 0.3477 - val_mae_euclidean: 0.8527 - val_jacard_coef: 0.4857 Epoch 178/200 8/8 [==============================] - 5s 612ms/step - loss: 0.0244 - mae_euclidean: 0.0978 - jacard_coef: 0.9524 - val_loss: 0.3457 - 
[Training log condensed: epochs 179-200/200 shown. Training loss falls from 0.0242 to 0.0213 (jacard_coef 0.9527 -> 0.9583) while the validation metrics plateau around val_loss ~0.34, val_jacard_coef ~0.49. Final epoch 200/200: loss: 0.0213 - mae_euclidean: 0.0945 - jacard_coef: 0.9583 - val_loss: 0.3427 - val_mae_euclidean: 0.9038 - val_jacard_coef: 0.4908]
Model trained for 975.937887430191s
plot_history(history_unet1, "Unet")
model_unet1.evaluate(X_val, y_val)
1/1 [==============================] - 1s 977ms/step - loss: 0.3261 - mae_euclidean: 0.9038 - jacard_coef: 0.5082
[0.3261321187019348, 0.9037566781044006, 0.5081525444984436]
tf.keras.backend.clear_session()
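For reference, the jacard_coef metric reported above is the intersection-over-union of the predicted and true border masks. A minimal NumPy sketch, assuming the common smoothed formulation (the smoothing constant and exact reduction are assumptions; the notebook's own Keras definition may differ):

```python
import numpy as np

def jaccard_coef(y_true, y_pred, smooth=1.0):
    """Smoothed Jaccard (IoU) between two masks with values in [0, 1].

    Assumption: mirrors the usual smoothed formulation; the notebook's
    actual tensor implementation may use a different smoothing constant.
    """
    intersection = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - intersection
    return (intersection + smooth) / (union + smooth)

# Identical masks score 1.0; partially overlapping masks score in between.
a = np.array([1.0, 1.0, 0.0, 0.0])
b = np.array([1.0, 0.0, 0.0, 1.0])
print(jaccard_coef(a, a))  # 1.0
print(jaccard_coef(a, b))  # 0.5 (intersection 1, union 3, smoothed (1+1)/(3+1))
```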
# Attention residual U-net 1
tf.random.set_seed(14)
model_att_res_unet1 = Attention_ResUNet(input_shape)
model_att_res_unet1.compile(optimizer=Adam(learning_rate=1e-2),
                            loss=dice_coef_loss,
                            metrics=[mae_euclidean, jacard_coef])
start = time.time()
history_att_res_unet1 = model_att_res_unet1.fit(X_train, y_train,
                                                validation_data=(X_val, y_val),
                                                batch_size=2,
                                                epochs=100)
print(f"Model trained for {time.time() - start}s")
model_att_res_unet1.save("2022-02-17 att_res_UNet1 100epochs.hdf5")
[Training log condensed: epochs 1-100/100 shown. Training improves steadily from loss 0.8110 / jacard_coef 0.1057 at epoch 1 to loss 0.0281 / jacard_coef 0.9455 at epoch 100. Validation is unstable in the first ~15 epochs, with val_mae_euclidean occasionally overflowing to values around 1.8e19, then stabilizes; the best validation score is val_jacard_coef 0.4845 at epoch 80. Final epoch 100/100: loss: 0.0281 - mae_euclidean: 0.0840 - jacard_coef: 0.9455 - val_loss: 0.3526 - val_mae_euclidean: 1.0255 - val_jacard_coef: 0.4797]
Model trained for 764.92422914505s
plot_history(history_att_res_unet1, "Attention residual Unet")
plot_train_val(model_att_res_unet1, 1, 1)
The results indicate that the smallest model, SA U-Net, gives the best Jaccard scores. The small number of channels in the network presumably leads to better generalization. Bayesian optimization of its hyperparameters will be performed next.
def model_builder_SA_UNet(hp):
    hp_block_size = hp.Int('block_size', min_value=15, max_value=35, step=5)  # bigger -> more regularization
    hp_keep_prob = hp.Float('keep_prob', min_value=0.5, max_value=0.9, step=0.1)
    hp_start_neurons = hp.Int('start_neurons', min_value=10, max_value=30, step=5)
    model = SA_UNet(input_shape, block_size=hp_block_size, keep_prob=hp_keep_prob, start_neurons=hp_start_neurons)
    model.compile(optimizer=Adam(learning_rate=1e-2),
                  loss=dice_coef_loss,
                  metrics=[mae_euclidean, jacard_coef])
    return model
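The dice_coef_loss used to compile these models (defined earlier in the notebook) is, in its usual form, one minus the Dice coefficient; Dice and Jaccard are monotonically related, so minimizing the Dice loss also drives the Jaccard score up. A minimal NumPy sketch of that relationship, assuming the standard smoothed formulation (the earlier notebook definition may differ in details):

```python
import numpy as np

def dice_coef(y_true, y_pred, smooth=1.0):
    """Smoothed Dice coefficient between two masks with values in [0, 1].

    Assumption: standard formulation; the notebook's tensor version may
    use a different smoothing constant.
    """
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

def dice_coef_loss(y_true, y_pred):
    return 1.0 - dice_coef(y_true, y_pred)

# With smooth=0, Dice = 2J / (1 + J), where J is the Jaccard coefficient.
a = np.array([1.0, 1.0, 0.0, 0.0])
b = np.array([1.0, 0.0, 0.0, 1.0])
d = dice_coef(a, b, smooth=0.0)  # intersection 1, mask sums 2 + 2 -> 2/4 = 0.5
j = 1.0 / 3.0                    # unsmoothed Jaccard of the same masks
print(d, 2 * j / (1 + j))        # both 0.5
```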
tuner = kt.BayesianOptimization(model_builder_SA_UNet,
                                objective=kt.Objective("val_mae_euclidean", direction='min'),
                                max_trials=16,
                                seed=14,
                                directory=os.path.normpath('C:/keras_tuner'),
                                project_name='2022-02-16_SA_UNet_tune1')
INFO:tensorflow:Reloading Oracle from existing project C:\keras_tuner\2022-02-16_SA_UNet_tune1\oracle.json
INFO:tensorflow:Reloading Tuner from C:\keras_tuner\2022-02-16_SA_UNet_tune1\tuner0.json
# stop_early = tf.keras.callbacks.EarlyStopping(monitor='val_jacard_coef',
#                                               mode='max',
#                                               patience=30)
tf.random.set_seed(14)
tuner.search(X_train, y_train,
             validation_data=(X_val, y_val),
             # callbacks=[stop_early],
             batch_size=8,
             epochs=200)
INFO:tensorflow:Oracle triggered exit
tuner.results_summary(num_trials=5)
Results summary
Results in C:\keras_tuner\2022-02-16_SA_UNet_tune1
Showing 5 best trials, Objective(name='val_mae_euclidean', direction='min')

block_size  keep_prob  start_neurons  Score
25          0.8        20             0.8769
35          0.9        30             0.9034
20          0.9        30             0.9522
35          0.9        10             0.9713
30          0.9        20             0.9722
The top 5 trials produce similar scores over a wide range of parameters; only keep_prob is consistently at the upper end. The block size in the top 3 trials varies between 25, 35 and 20, with objective differences within the noise.
# SA U-net 2
# increasing the block size from 19 to 25 as recommended by the tuner
tf.random.set_seed(14)
model_sa_unet2 = SA_UNet(input_shape, block_size=25, keep_prob=0.8, start_neurons=20)
model_sa_unet2.compile(optimizer=Adam(learning_rate=1e-2),
                       loss=dice_coef_loss,
                       metrics=[mae_euclidean, jacard_coef])
start = time.time()
history_sa_unet2 = model_sa_unet2.fit(X_train, y_train,
                                      validation_data=(X_val, y_val),
                                      batch_size=8,  # no resources for batch size 16
                                      epochs=200)
print(f"Model trained for {time.time() - start}s")
model_sa_unet2.save("2022-02-17 SA-UNet2 200epochs.hdf5")
[Training log condensed: epochs 1-55/200 shown (log truncated mid-epoch 55). Training improves steadily from loss 0.8421 / jacard_coef 0.0865 at epoch 1 to loss ~0.30 / jacard_coef ~0.53 by epoch 55. Validation diverges from about epoch 6 onward: val_loss stays at ~1.0, val_mae_euclidean overflows to values around 1.8e19, and val_jacard_coef drops to the 1e-5 to 1e-4 range.]
0.9330 - jacard_coef: 0.5364 - val_loss: 0.9999 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 8.8881e-05 Epoch 56/200 4/4 [==============================] - 2s 462ms/step - loss: 0.3030 - mae_euclidean: 1.0946 - jacard_coef: 0.5350 - val_loss: 0.9998 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 1.4050e-04 Epoch 57/200 4/4 [==============================] - 2s 463ms/step - loss: 0.3037 - mae_euclidean: 1.1249 - jacard_coef: 0.5341 - val_loss: 0.9998 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 1.0412e-04 Epoch 58/200 4/4 [==============================] - 2s 464ms/step - loss: 0.3018 - mae_euclidean: 1.0745 - jacard_coef: 0.5365 - val_loss: 0.9997 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 1.5027e-04 Epoch 59/200 4/4 [==============================] - 2s 464ms/step - loss: 0.3038 - mae_euclidean: 1.0821 - jacard_coef: 0.5341 - val_loss: 0.9999 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 8.5653e-05 Epoch 60/200 4/4 [==============================] - 2s 463ms/step - loss: 0.3101 - mae_euclidean: 1.0127 - jacard_coef: 0.5268 - val_loss: 0.9999 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 8.8375e-05 Epoch 61/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2979 - mae_euclidean: 1.0571 - jacard_coef: 0.5410 - val_loss: 0.9998 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 1.4184e-04 Epoch 62/200 4/4 [==============================] - 2s 464ms/step - loss: 0.3060 - mae_euclidean: 0.9863 - jacard_coef: 0.5315 - val_loss: 0.9996 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 2.3917e-04 Epoch 63/200 4/4 [==============================] - 2s 463ms/step - loss: 0.3124 - mae_euclidean: 1.0006 - jacard_coef: 0.5241 - val_loss: 0.9998 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 1.4123e-04 Epoch 64/200 4/4 [==============================] - 2s 463ms/step - loss: 0.3312 - 
mae_euclidean: 1.1581 - jacard_coef: 0.5030 - val_loss: 0.9987 - val_mae_euclidean: 11529213946556841984.0000 - val_jacard_coef: 6.8523e-04 Epoch 65/200 4/4 [==============================] - 2s 463ms/step - loss: 0.3166 - mae_euclidean: 1.0481 - jacard_coef: 0.5192 - val_loss: 0.9954 - val_mae_euclidean: 53.5558 - val_jacard_coef: 0.0023 Epoch 66/200 4/4 [==============================] - 2s 464ms/step - loss: 0.3252 - mae_euclidean: 1.2137 - jacard_coef: 0.5094 - val_loss: 0.9976 - val_mae_euclidean: 2305842871774740480.0000 - val_jacard_coef: 0.0012 Epoch 67/200 4/4 [==============================] - 2s 463ms/step - loss: 0.3082 - mae_euclidean: 1.0712 - jacard_coef: 0.5288 - val_loss: 0.9980 - val_mae_euclidean: 4611685743549480960.0000 - val_jacard_coef: 0.0010 Epoch 68/200 4/4 [==============================] - 2s 464ms/step - loss: 0.3039 - mae_euclidean: 1.0123 - jacard_coef: 0.5340 - val_loss: 0.9977 - val_mae_euclidean: 4611685743549480960.0000 - val_jacard_coef: 0.0012 Epoch 69/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2999 - mae_euclidean: 1.0624 - jacard_coef: 0.5388 - val_loss: 0.9959 - val_mae_euclidean: 2305842871774740480.0000 - val_jacard_coef: 0.0021 Epoch 70/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2907 - mae_euclidean: 0.9332 - jacard_coef: 0.5496 - val_loss: 0.9967 - val_mae_euclidean: 2305842871774740480.0000 - val_jacard_coef: 0.0017 Epoch 71/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2969 - mae_euclidean: 1.0269 - jacard_coef: 0.5426 - val_loss: 0.9990 - val_mae_euclidean: 6917528477885267968.0000 - val_jacard_coef: 4.9950e-04 Epoch 72/200 4/4 [==============================] - 2s 466ms/step - loss: 0.2947 - mae_euclidean: 1.0955 - jacard_coef: 0.5449 - val_loss: 0.9960 - val_mae_euclidean: 4611685743549480960.0000 - val_jacard_coef: 0.0020 Epoch 73/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2886 - mae_euclidean: 1.0048 - jacard_coef: 0.5521 - 
val_loss: 0.9965 - val_mae_euclidean: 75.6255 - val_jacard_coef: 0.0018 Epoch 74/200 4/4 [==============================] - 2s 462ms/step - loss: 0.2956 - mae_euclidean: 1.1388 - jacard_coef: 0.5440 - val_loss: 0.9922 - val_mae_euclidean: 40.5433 - val_jacard_coef: 0.0040 Epoch 75/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2870 - mae_euclidean: 0.9634 - jacard_coef: 0.5541 - val_loss: 0.9926 - val_mae_euclidean: 51.5498 - val_jacard_coef: 0.0037 Epoch 76/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2862 - mae_euclidean: 0.9544 - jacard_coef: 0.5551 - val_loss: 0.9918 - val_mae_euclidean: 40.8082 - val_jacard_coef: 0.0041 Epoch 77/200 4/4 [==============================] - 2s 463ms/step - loss: 0.3021 - mae_euclidean: 1.3926 - jacard_coef: 0.5371 - val_loss: 0.9842 - val_mae_euclidean: 36.0560 - val_jacard_coef: 0.0080 Epoch 78/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2994 - mae_euclidean: 1.0757 - jacard_coef: 0.5392 - val_loss: 0.9969 - val_mae_euclidean: 2305842871774740480.0000 - val_jacard_coef: 0.0016 Epoch 79/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2879 - mae_euclidean: 0.9651 - jacard_coef: 0.5531 - val_loss: 0.9957 - val_mae_euclidean: 51.9813 - val_jacard_coef: 0.0022 Epoch 80/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2880 - mae_euclidean: 0.9894 - jacard_coef: 0.5532 - val_loss: 0.9929 - val_mae_euclidean: 45.0087 - val_jacard_coef: 0.0036 Epoch 81/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2896 - mae_euclidean: 0.9247 - jacard_coef: 0.5511 - val_loss: 0.9968 - val_mae_euclidean: 56.5793 - val_jacard_coef: 0.0016 Epoch 82/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2798 - mae_euclidean: 1.0042 - jacard_coef: 0.5629 - val_loss: 0.9886 - val_mae_euclidean: 38.4394 - val_jacard_coef: 0.0058 Epoch 83/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2810 - mae_euclidean: 
0.9206 - jacard_coef: 0.5614 - val_loss: 0.9918 - val_mae_euclidean: 60.0607 - val_jacard_coef: 0.0041 Epoch 84/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2836 - mae_euclidean: 0.8802 - jacard_coef: 0.5584 - val_loss: 0.9747 - val_mae_euclidean: 22.5367 - val_jacard_coef: 0.0128 Epoch 85/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2849 - mae_euclidean: 0.9825 - jacard_coef: 0.5566 - val_loss: 0.9705 - val_mae_euclidean: 23.1788 - val_jacard_coef: 0.0150 Epoch 86/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2837 - mae_euclidean: 1.0044 - jacard_coef: 0.5581 - val_loss: 0.9294 - val_mae_euclidean: 15.2648 - val_jacard_coef: 0.0366 Epoch 87/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2807 - mae_euclidean: 0.9504 - jacard_coef: 0.5618 - val_loss: 0.9484 - val_mae_euclidean: 16.6622 - val_jacard_coef: 0.0265 Epoch 88/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2785 - mae_euclidean: 0.8878 - jacard_coef: 0.5644 - val_loss: 0.9633 - val_mae_euclidean: 21.4981 - val_jacard_coef: 0.0187 Epoch 89/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2878 - mae_euclidean: 1.2572 - jacard_coef: 0.5537 - val_loss: 0.8424 - val_mae_euclidean: 9.4738 - val_jacard_coef: 0.0855 Epoch 90/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2838 - mae_euclidean: 1.0824 - jacard_coef: 0.5580 - val_loss: 0.7896 - val_mae_euclidean: 7.7327 - val_jacard_coef: 0.1176 Epoch 91/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2773 - mae_euclidean: 0.9602 - jacard_coef: 0.5658 - val_loss: 0.8412 - val_mae_euclidean: 9.9946 - val_jacard_coef: 0.0862 Epoch 92/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2738 - mae_euclidean: 1.0554 - jacard_coef: 0.5702 - val_loss: 0.8416 - val_mae_euclidean: 9.5445 - val_jacard_coef: 0.0860 Epoch 93/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2903 - 
mae_euclidean: 1.0668 - jacard_coef: 0.5502 - val_loss: 0.8601 - val_mae_euclidean: 9.4267 - val_jacard_coef: 0.0752 Epoch 94/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2884 - mae_euclidean: 1.1928 - jacard_coef: 0.5527 - val_loss: 0.7775 - val_mae_euclidean: 7.8594 - val_jacard_coef: 0.1252 Epoch 95/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2986 - mae_euclidean: 1.0856 - jacard_coef: 0.5426 - val_loss: 0.7478 - val_mae_euclidean: 6.8446 - val_jacard_coef: 0.1443 Epoch 96/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2725 - mae_euclidean: 1.0373 - jacard_coef: 0.5718 - val_loss: 0.6812 - val_mae_euclidean: 5.9544 - val_jacard_coef: 0.1897 Epoch 97/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2785 - mae_euclidean: 1.0637 - jacard_coef: 0.5646 - val_loss: 0.7318 - val_mae_euclidean: 6.3802 - val_jacard_coef: 0.1549 Epoch 98/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2749 - mae_euclidean: 1.0731 - jacard_coef: 0.5688 - val_loss: 0.7374 - val_mae_euclidean: 6.1436 - val_jacard_coef: 0.1512 Epoch 99/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2720 - mae_euclidean: 1.0631 - jacard_coef: 0.5723 - val_loss: 0.6205 - val_mae_euclidean: 4.8591 - val_jacard_coef: 0.2342 Epoch 100/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2643 - mae_euclidean: 0.8377 - jacard_coef: 0.5820 - val_loss: 0.6673 - val_mae_euclidean: 5.2511 - val_jacard_coef: 0.1996 Epoch 101/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2737 - mae_euclidean: 1.0931 - jacard_coef: 0.5704 - val_loss: 0.6703 - val_mae_euclidean: 5.4202 - val_jacard_coef: 0.1974 Epoch 102/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2737 - mae_euclidean: 0.9871 - jacard_coef: 0.5703 - val_loss: 0.6769 - val_mae_euclidean: 5.7058 - val_jacard_coef: 0.1927 Epoch 103/200 4/4 [==============================] - 2s 464ms/step - loss: 
0.2678 - mae_euclidean: 0.8935 - jacard_coef: 0.5775 - val_loss: 0.6196 - val_mae_euclidean: 4.9030 - val_jacard_coef: 0.2348 Epoch 104/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2673 - mae_euclidean: 0.9595 - jacard_coef: 0.5783 - val_loss: 0.7086 - val_mae_euclidean: 5.7214 - val_jacard_coef: 0.1706 Epoch 105/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2642 - mae_euclidean: 0.9188 - jacard_coef: 0.5822 - val_loss: 0.5900 - val_mae_euclidean: 4.3312 - val_jacard_coef: 0.2579 Epoch 106/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2659 - mae_euclidean: 1.0484 - jacard_coef: 0.5800 - val_loss: 0.5859 - val_mae_euclidean: 4.3147 - val_jacard_coef: 0.2611 Epoch 107/200 4/4 [==============================] - 2s 463ms/step - loss: 0.2643 - mae_euclidean: 0.9650 - jacard_coef: 0.5819 - val_loss: 0.5974 - val_mae_euclidean: 4.0939 - val_jacard_coef: 0.2520 Epoch 108/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2661 - mae_euclidean: 0.9911 - jacard_coef: 0.5797 - val_loss: 0.5716 - val_mae_euclidean: 4.2069 - val_jacard_coef: 0.2726 Epoch 109/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2687 - mae_euclidean: 0.8368 - jacard_coef: 0.5765 - val_loss: 0.5736 - val_mae_euclidean: 4.3976 - val_jacard_coef: 0.2709 Epoch 110/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2625 - mae_euclidean: 0.8152 - jacard_coef: 0.5843 - val_loss: 0.6724 - val_mae_euclidean: 5.5033 - val_jacard_coef: 0.1959 Epoch 111/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2562 - mae_euclidean: 0.9108 - jacard_coef: 0.5922 - val_loss: 0.5860 - val_mae_euclidean: 4.5665 - val_jacard_coef: 0.2610 Epoch 112/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2701 - mae_euclidean: 0.9214 - jacard_coef: 0.5757 - val_loss: 0.6249 - val_mae_euclidean: 4.6217 - val_jacard_coef: 0.2309 Epoch 113/200 4/4 [==============================] - 2s 
464ms/step - loss: 0.2632 - mae_euclidean: 1.0201 - jacard_coef: 0.5836 - val_loss: 0.6099 - val_mae_euclidean: 4.4375 - val_jacard_coef: 0.2423 Epoch 114/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2627 - mae_euclidean: 0.9691 - jacard_coef: 0.5841 - val_loss: 0.6012 - val_mae_euclidean: 4.5538 - val_jacard_coef: 0.2491 Epoch 115/200 4/4 [==============================] - 2s 462ms/step - loss: 0.2593 - mae_euclidean: 0.7963 - jacard_coef: 0.5884 - val_loss: 0.6097 - val_mae_euclidean: 4.3995 - val_jacard_coef: 0.2425 Epoch 116/200 4/4 [==============================] - 2s 465ms/step - loss: 0.2572 - mae_euclidean: 0.8343 - jacard_coef: 0.5911 - val_loss: 0.5553 - val_mae_euclidean: 4.2329 - val_jacard_coef: 0.2859 Epoch 117/200 4/4 [==============================] - 2s 464ms/step - loss: 0.2573 - mae_euclidean: 0.9376 - jacard_coef: 0.5910 - val_loss: 0.5808 - val_mae_euclidean: 4.5407 - val_jacard_coef: 0.2652 Epoch 118/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2503 - mae_euclidean: 0.8115 - jacard_coef: 0.5997 - val_loss: 0.4867 - val_mae_euclidean: 3.3884 - val_jacard_coef: 0.3452 Epoch 119/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2535 - mae_euclidean: 0.9091 - jacard_coef: 0.5956 - val_loss: 0.5432 - val_mae_euclidean: 4.1913 - val_jacard_coef: 0.2960 Epoch 120/200 4/4 [==============================] - 2s 493ms/step - loss: 0.2477 - mae_euclidean: 0.8881 - jacard_coef: 0.6031 - val_loss: 0.5000 - val_mae_euclidean: 3.5178 - val_jacard_coef: 0.3334 Epoch 121/200 4/4 [==============================] - 2s 492ms/step - loss: 0.2488 - mae_euclidean: 0.8178 - jacard_coef: 0.6016 - val_loss: 0.5268 - val_mae_euclidean: 3.7795 - val_jacard_coef: 0.3099 Epoch 122/200 4/4 [==============================] - 2s 497ms/step - loss: 0.2527 - mae_euclidean: 0.8659 - jacard_coef: 0.5969 - val_loss: 0.4844 - val_mae_euclidean: 3.3347 - val_jacard_coef: 0.3473 Epoch 123/200 4/4 
[==============================] - 2s 493ms/step - loss: 0.2618 - mae_euclidean: 0.9829 - jacard_coef: 0.5858 - val_loss: 0.5132 - val_mae_euclidean: 3.2820 - val_jacard_coef: 0.3217 Epoch 124/200 4/4 [==============================] - 2s 494ms/step - loss: 0.2496 - mae_euclidean: 0.8742 - jacard_coef: 0.6006 - val_loss: 0.4293 - val_mae_euclidean: 2.8648 - val_jacard_coef: 0.3993 Epoch 125/200 4/4 [==============================] - 2s 492ms/step - loss: 0.2503 - mae_euclidean: 0.9130 - jacard_coef: 0.6001 - val_loss: 0.4437 - val_mae_euclidean: 2.9306 - val_jacard_coef: 0.3853 Epoch 126/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2639 - mae_euclidean: 0.9117 - jacard_coef: 0.5827 - val_loss: 0.4745 - val_mae_euclidean: 2.9748 - val_jacard_coef: 0.3564 Epoch 127/200 4/4 [==============================] - 2s 491ms/step - loss: 0.2545 - mae_euclidean: 0.9523 - jacard_coef: 0.5943 - val_loss: 0.4262 - val_mae_euclidean: 2.8947 - val_jacard_coef: 0.4023 Epoch 128/200 4/4 [==============================] - 2s 491ms/step - loss: 0.2547 - mae_euclidean: 0.9562 - jacard_coef: 0.5941 - val_loss: 0.3900 - val_mae_euclidean: 2.8866 - val_jacard_coef: 0.4389 Epoch 129/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2517 - mae_euclidean: 0.9122 - jacard_coef: 0.5983 - val_loss: 0.4941 - val_mae_euclidean: 3.9090 - val_jacard_coef: 0.3386 Epoch 130/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2463 - mae_euclidean: 0.8212 - jacard_coef: 0.6049 - val_loss: 0.4677 - val_mae_euclidean: 3.2145 - val_jacard_coef: 0.3627 Epoch 131/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2461 - mae_euclidean: 0.9014 - jacard_coef: 0.6051 - val_loss: 0.4433 - val_mae_euclidean: 3.0033 - val_jacard_coef: 0.3857 Epoch 132/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2606 - mae_euclidean: 0.9452 - jacard_coef: 0.5870 - val_loss: 0.4518 - val_mae_euclidean: 3.0594 - val_jacard_coef: 0.3776 Epoch 
133/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2434 - mae_euclidean: 0.9227 - jacard_coef: 0.6086 - val_loss: 0.3760 - val_mae_euclidean: 2.5775 - val_jacard_coef: 0.4535 Epoch 134/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2403 - mae_euclidean: 0.8312 - jacard_coef: 0.6127 - val_loss: 0.4469 - val_mae_euclidean: 2.6990 - val_jacard_coef: 0.3823 Epoch 135/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2437 - mae_euclidean: 0.8282 - jacard_coef: 0.6084 - val_loss: 0.3768 - val_mae_euclidean: 1.8702 - val_jacard_coef: 0.4526 Epoch 136/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2400 - mae_euclidean: 0.9424 - jacard_coef: 0.6130 - val_loss: 0.3766 - val_mae_euclidean: 2.0961 - val_jacard_coef: 0.4529 Epoch 137/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2415 - mae_euclidean: 0.8148 - jacard_coef: 0.6113 - val_loss: 0.4037 - val_mae_euclidean: 2.5450 - val_jacard_coef: 0.4248 Epoch 138/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2451 - mae_euclidean: 1.0723 - jacard_coef: 0.6068 - val_loss: 0.3705 - val_mae_euclidean: 1.3696 - val_jacard_coef: 0.4593 Epoch 139/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2360 - mae_euclidean: 0.7500 - jacard_coef: 0.6182 - val_loss: 0.3770 - val_mae_euclidean: 2.1402 - val_jacard_coef: 0.4525 Epoch 140/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2322 - mae_euclidean: 0.7313 - jacard_coef: 0.6231 - val_loss: 0.3523 - val_mae_euclidean: 1.7587 - val_jacard_coef: 0.4790 Epoch 141/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2425 - mae_euclidean: 0.9713 - jacard_coef: 0.6110 - val_loss: 0.3747 - val_mae_euclidean: 2.4750 - val_jacard_coef: 0.4549 Epoch 142/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2429 - mae_euclidean: 0.8125 - jacard_coef: 0.6092 - val_loss: 0.3493 - val_mae_euclidean: 1.7488 - val_jacard_coef: 
0.4823 Epoch 143/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2374 - mae_euclidean: 0.8111 - jacard_coef: 0.6164 - val_loss: 0.3232 - val_mae_euclidean: 1.2571 - val_jacard_coef: 0.5115 Epoch 144/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2376 - mae_euclidean: 0.7906 - jacard_coef: 0.6162 - val_loss: 0.3196 - val_mae_euclidean: 1.4102 - val_jacard_coef: 0.5156 Epoch 145/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2369 - mae_euclidean: 0.9044 - jacard_coef: 0.6171 - val_loss: 0.3076 - val_mae_euclidean: 1.3635 - val_jacard_coef: 0.5295 Epoch 146/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2376 - mae_euclidean: 0.8754 - jacard_coef: 0.6162 - val_loss: 0.3154 - val_mae_euclidean: 1.6977 - val_jacard_coef: 0.5205 Epoch 147/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2346 - mae_euclidean: 0.8266 - jacard_coef: 0.6201 - val_loss: 0.3653 - val_mae_euclidean: 2.4151 - val_jacard_coef: 0.4649 Epoch 148/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2374 - mae_euclidean: 0.8903 - jacard_coef: 0.6164 - val_loss: 0.3165 - val_mae_euclidean: 1.8573 - val_jacard_coef: 0.5192 Epoch 149/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2434 - mae_euclidean: 0.8438 - jacard_coef: 0.6087 - val_loss: 0.3581 - val_mae_euclidean: 1.9169 - val_jacard_coef: 0.4727 Epoch 150/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2435 - mae_euclidean: 1.0658 - jacard_coef: 0.6086 - val_loss: 0.3118 - val_mae_euclidean: 1.1504 - val_jacard_coef: 0.5246 Epoch 151/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2381 - mae_euclidean: 0.7577 - jacard_coef: 0.6154 - val_loss: 0.3222 - val_mae_euclidean: 1.3423 - val_jacard_coef: 0.5126 Epoch 152/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2385 - mae_euclidean: 0.7845 - jacard_coef: 0.6152 - val_loss: 0.3269 - val_mae_euclidean: 1.7041 - 
val_jacard_coef: 0.5073 Epoch 153/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2362 - mae_euclidean: 0.9409 - jacard_coef: 0.6179 - val_loss: 0.3066 - val_mae_euclidean: 1.5807 - val_jacard_coef: 0.5307 Epoch 154/200 4/4 [==============================] - 2s 491ms/step - loss: 0.2392 - mae_euclidean: 0.8480 - jacard_coef: 0.6141 - val_loss: 0.3382 - val_mae_euclidean: 1.7508 - val_jacard_coef: 0.4945 Epoch 155/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2301 - mae_euclidean: 0.7987 - jacard_coef: 0.6260 - val_loss: 0.3287 - val_mae_euclidean: 1.7576 - val_jacard_coef: 0.5052 Epoch 156/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2286 - mae_euclidean: 0.8420 - jacard_coef: 0.6280 - val_loss: 0.3403 - val_mae_euclidean: 1.3617 - val_jacard_coef: 0.4922 Epoch 157/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2259 - mae_euclidean: 0.7292 - jacard_coef: 0.6317 - val_loss: 0.3199 - val_mae_euclidean: 1.1489 - val_jacard_coef: 0.5153 Epoch 158/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2350 - mae_euclidean: 0.8872 - jacard_coef: 0.6195 - val_loss: 0.3288 - val_mae_euclidean: 1.1481 - val_jacard_coef: 0.5051 Epoch 159/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2282 - mae_euclidean: 0.7770 - jacard_coef: 0.6286 - val_loss: 0.3229 - val_mae_euclidean: 1.1743 - val_jacard_coef: 0.5118 Epoch 160/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2243 - mae_euclidean: 0.8462 - jacard_coef: 0.6337 - val_loss: 0.3185 - val_mae_euclidean: 1.1317 - val_jacard_coef: 0.5169 Epoch 161/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2202 - mae_euclidean: 0.7995 - jacard_coef: 0.6392 - val_loss: 0.3170 - val_mae_euclidean: 1.1560 - val_jacard_coef: 0.5186 Epoch 162/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2156 - mae_euclidean: 0.6882 - jacard_coef: 0.6454 - val_loss: 0.3211 - 
val_mae_euclidean: 1.4405 - val_jacard_coef: 0.5139 Epoch 163/200 4/4 [==============================] - 2s 487ms/step - loss: 0.2155 - mae_euclidean: 0.8042 - jacard_coef: 0.6455 - val_loss: 0.3325 - val_mae_euclidean: 1.3346 - val_jacard_coef: 0.5009 Epoch 164/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2155 - mae_euclidean: 0.7525 - jacard_coef: 0.6454 - val_loss: 0.3107 - val_mae_euclidean: 1.2859 - val_jacard_coef: 0.5259 Epoch 165/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2138 - mae_euclidean: 0.7174 - jacard_coef: 0.6478 - val_loss: 0.3116 - val_mae_euclidean: 1.2369 - val_jacard_coef: 0.5249 Epoch 166/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2130 - mae_euclidean: 0.7729 - jacard_coef: 0.6488 - val_loss: 0.3177 - val_mae_euclidean: 1.2143 - val_jacard_coef: 0.5178 Epoch 167/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2154 - mae_euclidean: 0.7856 - jacard_coef: 0.6457 - val_loss: 0.3298 - val_mae_euclidean: 1.6428 - val_jacard_coef: 0.5040 Epoch 168/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2201 - mae_euclidean: 0.9091 - jacard_coef: 0.6394 - val_loss: 0.3280 - val_mae_euclidean: 1.8519 - val_jacard_coef: 0.5060 Epoch 169/200 4/4 [==============================] - 2s 487ms/step - loss: 0.2149 - mae_euclidean: 0.7518 - jacard_coef: 0.6464 - val_loss: 0.3281 - val_mae_euclidean: 1.9307 - val_jacard_coef: 0.5059 Epoch 170/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2144 - mae_euclidean: 0.7051 - jacard_coef: 0.6470 - val_loss: 0.2990 - val_mae_euclidean: 1.6582 - val_jacard_coef: 0.5397 Epoch 171/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2132 - mae_euclidean: 0.8076 - jacard_coef: 0.6487 - val_loss: 0.2982 - val_mae_euclidean: 1.5595 - val_jacard_coef: 0.5406 Epoch 172/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2267 - mae_euclidean: 0.8468 - jacard_coef: 0.6305 - 
val_loss: 0.3344 - val_mae_euclidean: 1.8893 - val_jacard_coef: 0.4988 Epoch 173/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2108 - mae_euclidean: 0.7051 - jacard_coef: 0.6519 - val_loss: 0.3186 - val_mae_euclidean: 1.5555 - val_jacard_coef: 0.5168 Epoch 174/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2082 - mae_euclidean: 0.6649 - jacard_coef: 0.6555 - val_loss: 0.3150 - val_mae_euclidean: 1.3060 - val_jacard_coef: 0.5209 Epoch 175/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2073 - mae_euclidean: 0.8185 - jacard_coef: 0.6567 - val_loss: 0.3110 - val_mae_euclidean: 1.2852 - val_jacard_coef: 0.5256 Epoch 176/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2115 - mae_euclidean: 0.7972 - jacard_coef: 0.6512 - val_loss: 0.3038 - val_mae_euclidean: 1.1381 - val_jacard_coef: 0.5340 Epoch 177/200 4/4 [==============================] - 2s 493ms/step - loss: 0.2096 - mae_euclidean: 0.7901 - jacard_coef: 0.6535 - val_loss: 0.3349 - val_mae_euclidean: 1.7262 - val_jacard_coef: 0.4983 Epoch 178/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2042 - mae_euclidean: 0.8160 - jacard_coef: 0.6610 - val_loss: 0.3182 - val_mae_euclidean: 1.7814 - val_jacard_coef: 0.5172 Epoch 179/200 4/4 [==============================] - 2s 489ms/step - loss: 0.2059 - mae_euclidean: 0.6667 - jacard_coef: 0.6586 - val_loss: 0.3338 - val_mae_euclidean: 2.4323 - val_jacard_coef: 0.4995 Epoch 180/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2043 - mae_euclidean: 0.7230 - jacard_coef: 0.6609 - val_loss: 0.3112 - val_mae_euclidean: 1.6187 - val_jacard_coef: 0.5253 Epoch 181/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2052 - mae_euclidean: 0.7401 - jacard_coef: 0.6595 - val_loss: 0.3224 - val_mae_euclidean: 2.2578 - val_jacard_coef: 0.5125 Epoch 182/200 4/4 [==============================] - 2s 491ms/step - loss: 0.2080 - mae_euclidean: 0.7641 - 
jacard_coef: 0.6559 - val_loss: 0.3213 - val_mae_euclidean: 1.2278 - val_jacard_coef: 0.5136 Epoch 183/200 4/4 [==============================] - 2s 491ms/step - loss: 0.2130 - mae_euclidean: 0.7163 - jacard_coef: 0.6490 - val_loss: 0.3161 - val_mae_euclidean: 1.4161 - val_jacard_coef: 0.5197 Epoch 184/200 4/4 [==============================] - 2s 492ms/step - loss: 0.2037 - mae_euclidean: 0.7320 - jacard_coef: 0.6617 - val_loss: 0.3182 - val_mae_euclidean: 1.4097 - val_jacard_coef: 0.5172 Epoch 185/200 4/4 [==============================] - 2s 492ms/step - loss: 0.2008 - mae_euclidean: 0.6395 - jacard_coef: 0.6657 - val_loss: 0.3131 - val_mae_euclidean: 2.2281 - val_jacard_coef: 0.5231 Epoch 186/200 4/4 [==============================] - 2s 491ms/step - loss: 0.2004 - mae_euclidean: 0.6047 - jacard_coef: 0.6662 - val_loss: 0.3112 - val_mae_euclidean: 1.2620 - val_jacard_coef: 0.5254 Epoch 187/200 4/4 [==============================] - 2s 492ms/step - loss: 0.2047 - mae_euclidean: 0.8512 - jacard_coef: 0.6602 - val_loss: 0.3064 - val_mae_euclidean: 1.2930 - val_jacard_coef: 0.5310 Epoch 188/200 4/4 [==============================] - 2s 490ms/step - loss: 0.1989 - mae_euclidean: 0.7427 - jacard_coef: 0.6683 - val_loss: 0.3171 - val_mae_euclidean: 1.3153 - val_jacard_coef: 0.5185 Epoch 189/200 4/4 [==============================] - 2s 490ms/step - loss: 0.1979 - mae_euclidean: 0.7128 - jacard_coef: 0.6696 - val_loss: 0.3051 - val_mae_euclidean: 1.2965 - val_jacard_coef: 0.5324 Epoch 190/200 4/4 [==============================] - 2s 489ms/step - loss: 0.1927 - mae_euclidean: 0.6129 - jacard_coef: 0.6770 - val_loss: 0.3289 - val_mae_euclidean: 1.4021 - val_jacard_coef: 0.5050 Epoch 191/200 4/4 [==============================] - 2s 490ms/step - loss: 0.1899 - mae_euclidean: 0.6343 - jacard_coef: 0.6808 - val_loss: 0.3276 - val_mae_euclidean: 1.4166 - val_jacard_coef: 0.5065 Epoch 192/200 4/4 [==============================] - 2s 490ms/step - loss: 0.1924 - 
mae_euclidean: 0.6873 - jacard_coef: 0.6773 - val_loss: 0.3116 - val_mae_euclidean: 1.3575 - val_jacard_coef: 0.5249 Epoch 193/200 4/4 [==============================] - 2s 488ms/step - loss: 0.1929 - mae_euclidean: 0.7525 - jacard_coef: 0.6766 - val_loss: 0.3201 - val_mae_euclidean: 1.3801 - val_jacard_coef: 0.5150 Epoch 194/200 4/4 [==============================] - 2s 492ms/step - loss: 0.1964 - mae_euclidean: 0.8713 - jacard_coef: 0.6720 - val_loss: 0.3264 - val_mae_euclidean: 1.3174 - val_jacard_coef: 0.5078 Epoch 195/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2125 - mae_euclidean: 1.0465 - jacard_coef: 0.6499 - val_loss: 0.3508 - val_mae_euclidean: 2.5056 - val_jacard_coef: 0.4806 Epoch 196/200 4/4 [==============================] - 2s 490ms/step - loss: 0.2129 - mae_euclidean: 0.8019 - jacard_coef: 0.6493 - val_loss: 0.3201 - val_mae_euclidean: 1.6354 - val_jacard_coef: 0.5150 Epoch 197/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2100 - mae_euclidean: 0.9021 - jacard_coef: 0.6532 - val_loss: 0.3269 - val_mae_euclidean: 1.2411 - val_jacard_coef: 0.5073 Epoch 198/200 4/4 [==============================] - 2s 488ms/step - loss: 0.2019 - mae_euclidean: 1.1415 - jacard_coef: 0.6641 - val_loss: 0.3451 - val_mae_euclidean: 2.2220 - val_jacard_coef: 0.4869 Epoch 199/200 4/4 [==============================] - 2s 488ms/step - loss: 0.1958 - mae_euclidean: 0.7096 - jacard_coef: 0.6726 - val_loss: 0.3542 - val_mae_euclidean: 1.5990 - val_jacard_coef: 0.4769 Epoch 200/200 4/4 [==============================] - 2s 488ms/step - loss: 0.1968 - mae_euclidean: 0.7984 - jacard_coef: 0.6712 - val_loss: 0.3165 - val_mae_euclidean: 1.1171 - val_jacard_coef: 0.5191 Model trained for 382.39845991134644s
plot_history(history_sa_unet2, "SA Unet")
model_sa_unet2.evaluate(X_val, y_val)
1/1 [==============================] - 0s 352ms/step - loss: 0.3165 - mae_euclidean: 1.1171 - jacard_coef: 0.5191
[0.3165448307991028, 1.1170706748962402, 0.5191344618797302]
There is a slight upward trend in the validation loss, a sign of overfitting. The val_mae_euclidean (the tuning objective) is lower (good), but the Jaccard coefficient is significantly lower (not good) than for model_sa_unet1.
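The comparison above mixes a dice-based loss with a Jaccard metric; the two are algebraically linked (dice = 2J / (1 + J)), so they rank models similarly but not identically. The notebook's own jacard_coef and dice_coef_loss are defined earlier and not shown in this chunk; a minimal pure-Python sketch of how such metrics are typically computed on binary masks:

```python
def jaccard(y_true, y_pred, eps=1e-7):
    """Intersection over union for flat binary masks (lists of 0/1)."""
    inter = sum(t * p for t, p in zip(y_true, y_pred))
    union = sum(y_true) + sum(y_pred) - inter
    return (inter + eps) / (union + eps)

def dice(y_true, y_pred, eps=1e-7):
    """Dice coefficient; a dice loss is typically 1 - dice."""
    inter = sum(t * p for t, p in zip(y_true, y_pred))
    return (2 * inter + eps) / (sum(y_true) + sum(y_pred) + eps)

# toy border masks: 6 pixels, 2 true positives, 1 false positive, 1 false negative
t = [1, 1, 0, 0, 1, 0]
p = [1, 0, 0, 1, 1, 0]
j, d = jaccard(t, p), dice(t, p)
# dice and Jaccard satisfy d = 2j / (1 + j)
```

The real metrics operate on float tensors with smoothing terms, but the monotone relation between the two scores carries over, which is why a model can improve one while the other degrades only when the smoothing or thresholding differs.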
plot_train_val(model_sa_unet2, 1, 1)
# continue training with a smaller learning rate
model_sa_unet2a = tf.keras.models.clone_model(model_sa_unet2)
model_sa_unet2a.set_weights(model_sa_unet2.get_weights())  # clone_model copies the architecture only, not the weights
model_sa_unet2a.compile(optimizer=Adam(learning_rate=1e-3),
                        loss=dice_coef_loss,
                        metrics=[mae_euclidean, jacard_coef]
                        )
history_sa_unet2a = model_sa_unet2a.fit(X_train, y_train,
                                        validation_data=(X_val, y_val),
                                        batch_size=8,
                                        epochs=100
                                        )
model_sa_unet2a.save("2022-02-17 best SA-UNet2a 200+100epochs.hdf5")
Epoch 1/100 4/4 [==============================] - 6s 584ms/step - loss: 0.1958 - mae_euclidean: 0.7189 - jacard_coef: 0.6725 - val_loss: 0.3379 - val_mae_euclidean: 1.1639 - val_jacard_coef: 0.4949
Epoch 2/100 4/4 [==============================] - 2s 487ms/step - loss: 0.1888 - mae_euclidean: 0.6815 - jacard_coef: 0.6825 - val_loss: 0.3262 - val_mae_euclidean: 1.2222 - val_jacard_coef: 0.5081
Epoch 3/100 4/4 [==============================] - 2s 487ms/step - loss: 0.1839 - mae_euclidean: 0.5711 - jacard_coef: 0.6896 - val_loss: 0.3140 - val_mae_euclidean: 1.2105 - val_jacard_coef: 0.5221
…
Epoch 99/100 4/4 [==============================] - 2s 493ms/step - loss: 0.1572 - mae_euclidean: 0.5393 - jacard_coef: 0.7283 - val_loss: 0.3305 - val_mae_euclidean: 1.6487 - val_jacard_coef: 0.5032
Epoch 100/100 4/4 [==============================] - 2s 494ms/step - loss: 0.1537 - mae_euclidean: 0.5503 - jacard_coef: 0.7336 - val_loss: 0.3305 - val_mae_euclidean: 1.6478 - val_jacard_coef: 0.5032
plot_history(history_sa_unet2a, "SA Unet2a")
No further improvement was achieved with the smaller learning rate, and no significant overfitting occurred.
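The two-stage schedule used here (200 epochs at 1e-2, then 100 more at 1e-3) was applied manually; Keras also offers the ReduceLROnPlateau callback to automate the drop. A minimal pure-Python sketch of the plateau logic (illustrative only; the parameter names mirror the Keras callback, but this is not its implementation):

```python
def plateau_schedule(val_losses, lr=1e-2, factor=0.1, patience=5, min_delta=1e-4):
    """Multiply lr by `factor` whenever val loss fails to improve for `patience` epochs."""
    best = float("inf")
    wait = 0
    lrs = []
    for loss in val_losses:
        if loss < best - min_delta:   # meaningful improvement: reset the counter
            best = loss
            wait = 0
        else:                         # stagnation: count epochs since last improvement
            wait += 1
            if wait >= patience:
                lr *= factor
                wait = 0
        lrs.append(lr)
    return lrs

# improving for 3 epochs, then flat for 10: the lr drops after each 5 stagnant epochs
history = [0.35, 0.33, 0.31] + [0.31] * 10
lrs = plateau_schedule(history)
```

In the actual run one would pass `tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.1, patience=5)` to `model.fit(..., callbacks=[...])` instead of cloning and recompiling the model by hand.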
model_sa_unet2a.evaluate(X_val, y_val)
1/1 [==============================] - 0s 350ms/step - loss: 0.3305 - mae_euclidean: 1.6478 - jacard_coef: 0.5032
[0.33050423860549927, 1.6478126049041748, 0.5031962394714355]
The focal loss function is commonly used for segmentation tasks. Below, a hyperparameter search is performed to find the best values for gamma and the positive-class weight, and the result is compared with dice loss.
# finding the optimal focal loss parameters
def model_builder_SA_UNet_2(hp):
    model = SA_UNet(input_shape, block_size=25, keep_prob=0.8, start_neurons=20)
    hp_gamma = hp.Float('gamma', min_value=0, max_value=5, step=1)  # gamma=0 reduces the focal loss to cross-entropy
    hp_pos_weight = hp.Float('pos_weight', min_value=1, max_value=3, step=0.5)
    model.compile(optimizer=Adam(learning_rate=1e-2),
                  loss=BinaryFocalLoss(gamma=hp_gamma, pos_weight=hp_pos_weight),
                  metrics=[mae_euclidean, jacard_coef])
    return model
tuner2 = kt.BayesianOptimization(model_builder_SA_UNet_2,
                                 objective=kt.Objective("val_mae_euclidean", direction='min'),
                                 max_trials=12,
                                 seed=14,
                                 directory=os.path.normpath('C:/keras_tuner'),
                                 project_name='2022-02-16_SA_UNet_tune2')
INFO:tensorflow:Reloading Oracle from existing project C:\keras_tuner\2022-02-16_SA_UNet_tune2\oracle.json
INFO:tensorflow:Reloading Tuner from C:\keras_tuner\2022-02-16_SA_UNet_tune2\tuner0.json
tf.random.set_seed(14)
tuner2.search(X_train, y_train,
              validation_data=(X_val, y_val),
              batch_size=8,
              epochs=200
              )
Trial 13 Complete [00h 06m 38s]
val_mae_euclidean: 1.0857945680618286
Best val_mae_euclidean So Far: 0.9302276968955994
Total elapsed time: 00h 13m 21s
INFO:tensorflow:Oracle triggered exit
tuner2.results_summary(num_trials=8)
Results summary
Results in C:\keras_tuner\2022-02-16_SA_UNet_tune2
Showing 8 best trials
Objective(name='val_mae_euclidean', direction='min')
Trial: gamma: 0.0, pos_weight: 1.0, Score: 0.9302276968955994
Trial: gamma: 0.0, pos_weight: 1.0, Score: 0.9340724945068359
Trial: gamma: 0.0, pos_weight: 1.0, Score: 0.9897882342338562
Trial: gamma: 2.0, pos_weight: 2.0, Score: 0.9955947399139404
Trial: gamma: 1.0, pos_weight: 3.0, Score: 1.0241363048553467
Trial: gamma: 0.0, pos_weight: 3.0, Score: 1.0857945680618286
Trial: gamma: 0.0, pos_weight: 1.0, Score: 1.1025344133377075
Trial: gamma: 5.0, pos_weight: 3.0, Score: 1.1217049360275269
The best score is achieved with gamma=0 and pos_weight=1, which reduces the focal loss to plain binary cross-entropy. The top five trials show similar performance.
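The reduction claim can be checked directly: with gamma=0 and pos_weight=1 the focal loss equals binary cross-entropy term by term. A minimal pure-Python sketch (illustrative; the notebook's BinaryFocalLoss is imported earlier and operates on tensors, not scalars):

```python
import math

def binary_focal(y, p, gamma=0.0, pos_weight=1.0, eps=1e-7):
    """Per-pixel binary focal loss; gamma=0 and pos_weight=1 give plain BCE."""
    p = min(max(p, eps), 1 - eps)                          # clamp to avoid log(0)
    pos = -pos_weight * (1 - p) ** gamma * y * math.log(p)  # positive (border) term
    neg = -(p ** gamma) * (1 - y) * math.log(1 - p)         # negative (background) term
    return pos + neg

def bce(y, p, eps=1e-7):
    """Plain binary cross-entropy for one pixel."""
    p = min(max(p, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

pairs = [(1, 0.9), (0, 0.2), (1, 0.4), (0, 0.7)]
# gamma=0, pos_weight=1: identical to BCE for every pair
# gamma>0 down-weights easy, well-classified pixels;
# pos_weight>1 up-weights the rare border class
```

For thin grain borders the positive class is heavily outnumbered, which is why pos_weight > 1 was worth trying; here the search nevertheless preferred the unweighted setting.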
# SA Unet3
# changing the loss from dice loss to binary cross-entropy
tf.random.set_seed(14)
model_sa_unet3 = SA_UNet(input_shape, block_size=25, keep_prob=0.8, start_neurons=20)
model_sa_unet3.compile(optimizer=Adam(learning_rate=1e-2),
                       loss="BinaryCrossentropy",
                       metrics=[mae_euclidean, jacard_coef]
                       )
start = time.time()
history_sa_unet3 = model_sa_unet3.fit(X_train, y_train,
                                      validation_data=(X_val, y_val),
                                      batch_size=8,  # no resources for 16
                                      epochs=200
                                      )
print(f"Model trained for {time.time() - start}s")
model_sa_unet3.save("2022-02-17 SA-UNet3 200epochs.hdf5")
Epoch 1/200 4/4 [==============================] - 6s 591ms/step - loss: 0.6256 - mae_euclidean: 6.5440 - jacard_coef: 0.0696 - val_loss: 35092.6523 - val_mae_euclidean: 6.1062 - val_jacard_coef: 0.0738
Epoch 2/200 4/4 [==============================] - 2s 496ms/step - loss: 0.4085 - mae_euclidean: 20.9610 - jacard_coef: 0.0680 - val_loss: 162299.3281 - val_mae_euclidean: 5.7550 - val_jacard_coef: 0.0660
Epoch 3/200 4/4 [==============================] - 2s 494ms/step - loss: 0.3270 - mae_euclidean: 1152921435887370240.0000 - jacard_coef: 0.0676 - val_loss: 26849.3926 - val_mae_euclidean: 5.8864 - val_jacard_coef: 0.0704
…
Epoch 9/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2143 - mae_euclidean: 17.8027 - jacard_coef: 0.1174 - val_loss: 0.6390 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 0.0029
…
Epoch 54/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1304 - mae_euclidean: 1.3056 - jacard_coef: 0.3153 - val_loss: 0.5023 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 0.0017
Epoch 55/200 4/4 [==============================] - 2s
495ms/step - loss: 0.1298 - mae_euclidean: 1.2364 - jacard_coef: 0.3136 - val_loss: 0.5672 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 9.4360e-04 Epoch 56/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1311 - mae_euclidean: 1.2929 - jacard_coef: 0.3122 - val_loss: 0.5044 - val_mae_euclidean: 18446742974197923840.0000 - val_jacard_coef: 0.0019 Epoch 57/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1296 - mae_euclidean: 1.2451 - jacard_coef: 0.3197 - val_loss: 0.5082 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 0.0019 Epoch 58/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1291 - mae_euclidean: 1.2361 - jacard_coef: 0.3170 - val_loss: 0.4965 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 0.0025 Epoch 59/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1305 - mae_euclidean: 1.2496 - jacard_coef: 0.3161 - val_loss: 0.4665 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 0.0034 Epoch 60/200 4/4 [==============================] - 2s 494ms/step - loss: 0.1339 - mae_euclidean: 1.2908 - jacard_coef: 0.3113 - val_loss: 0.4385 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 0.0056 Epoch 61/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1302 - mae_euclidean: 1.2032 - jacard_coef: 0.3177 - val_loss: 0.4834 - val_mae_euclidean: 16140899964984229888.0000 - val_jacard_coef: 0.0046 Epoch 62/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1317 - mae_euclidean: 1.2970 - jacard_coef: 0.3141 - val_loss: 0.4027 - val_mae_euclidean: 11529213946556841984.0000 - val_jacard_coef: 0.0112 Epoch 63/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1324 - mae_euclidean: 1.3576 - jacard_coef: 0.3093 - val_loss: 0.4244 - val_mae_euclidean: 9223371487098961920.0000 - val_jacard_coef: 0.0085 Epoch 64/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1322 
- mae_euclidean: 1.3570 - jacard_coef: 0.3105 - val_loss: 0.4026 - val_mae_euclidean: 6917528477885267968.0000 - val_jacard_coef: 0.0111 Epoch 65/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1290 - mae_euclidean: 1.2493 - jacard_coef: 0.3168 - val_loss: 0.4697 - val_mae_euclidean: 11529213946556841984.0000 - val_jacard_coef: 0.0057 Epoch 66/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1298 - mae_euclidean: 1.2664 - jacard_coef: 0.3106 - val_loss: 0.4673 - val_mae_euclidean: 11529213946556841984.0000 - val_jacard_coef: 0.0056 Epoch 67/200 4/4 [==============================] - 2s 494ms/step - loss: 0.1276 - mae_euclidean: 1.1789 - jacard_coef: 0.3238 - val_loss: 0.4268 - val_mae_euclidean: 6917528477885267968.0000 - val_jacard_coef: 0.0095 Epoch 68/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1289 - mae_euclidean: 1.2667 - jacard_coef: 0.3241 - val_loss: 0.4434 - val_mae_euclidean: 11529213946556841984.0000 - val_jacard_coef: 0.0064 Epoch 69/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1258 - mae_euclidean: 1.2569 - jacard_coef: 0.3307 - val_loss: 0.4766 - val_mae_euclidean: 4611685743549480960.0000 - val_jacard_coef: 0.0083 Epoch 70/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1249 - mae_euclidean: 1.2185 - jacard_coef: 0.3291 - val_loss: 0.4475 - val_mae_euclidean: 4611685743549480960.0000 - val_jacard_coef: 0.0086 Epoch 71/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1258 - mae_euclidean: 1.3262 - jacard_coef: 0.3231 - val_loss: 0.4468 - val_mae_euclidean: 4611685743549480960.0000 - val_jacard_coef: 0.0091 Epoch 72/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1255 - mae_euclidean: 1.2050 - jacard_coef: 0.3346 - val_loss: 0.4421 - val_mae_euclidean: 79.8886 - val_jacard_coef: 0.0121 Epoch 73/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1245 - mae_euclidean: 1.1538 - jacard_coef: 0.3358 - 
val_loss: 0.4432 - val_mae_euclidean: 73.1379 - val_jacard_coef: 0.0113 Epoch 74/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1261 - mae_euclidean: 1.3606 - jacard_coef: 0.3291 - val_loss: 0.4343 - val_mae_euclidean: 42.5144 - val_jacard_coef: 0.0186 Epoch 75/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1229 - mae_euclidean: 1.2249 - jacard_coef: 0.3370 - val_loss: 0.4883 - val_mae_euclidean: 57.2914 - val_jacard_coef: 0.0113 Epoch 76/200 4/4 [==============================] - 2s 494ms/step - loss: 0.1218 - mae_euclidean: 1.2004 - jacard_coef: 0.3379 - val_loss: 0.4237 - val_mae_euclidean: 34.7500 - val_jacard_coef: 0.0232 Epoch 77/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1225 - mae_euclidean: 1.3101 - jacard_coef: 0.3366 - val_loss: 0.4497 - val_mae_euclidean: 31.9013 - val_jacard_coef: 0.0193 Epoch 78/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1256 - mae_euclidean: 1.2751 - jacard_coef: 0.3374 - val_loss: 0.4445 - val_mae_euclidean: 57.8290 - val_jacard_coef: 0.0128 Epoch 79/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1238 - mae_euclidean: 1.2230 - jacard_coef: 0.3379 - val_loss: 0.3439 - val_mae_euclidean: 16.3804 - val_jacard_coef: 0.0602 Epoch 80/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1241 - mae_euclidean: 1.3246 - jacard_coef: 0.3387 - val_loss: 0.3433 - val_mae_euclidean: 26.0244 - val_jacard_coef: 0.0405 Epoch 81/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1226 - mae_euclidean: 1.2095 - jacard_coef: 0.3365 - val_loss: 0.3949 - val_mae_euclidean: 17.8075 - val_jacard_coef: 0.0406 Epoch 82/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1202 - mae_euclidean: 1.1927 - jacard_coef: 0.3469 - val_loss: 0.3565 - val_mae_euclidean: 27.1684 - val_jacard_coef: 0.0368 Epoch 83/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1212 - mae_euclidean: 1.1469 - 
jacard_coef: 0.3424 - val_loss: 0.3986 - val_mae_euclidean: 21.6943 - val_jacard_coef: 0.0410 Epoch 84/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1221 - mae_euclidean: 1.1009 - jacard_coef: 0.3485 - val_loss: 0.3873 - val_mae_euclidean: 16.7349 - val_jacard_coef: 0.0420 Epoch 85/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1224 - mae_euclidean: 1.1551 - jacard_coef: 0.3354 - val_loss: 0.3954 - val_mae_euclidean: 20.0871 - val_jacard_coef: 0.0455 Epoch 86/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1192 - mae_euclidean: 1.1334 - jacard_coef: 0.3516 - val_loss: 0.4551 - val_mae_euclidean: 17.0372 - val_jacard_coef: 0.0422 Epoch 87/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1183 - mae_euclidean: 1.1576 - jacard_coef: 0.3546 - val_loss: 0.3272 - val_mae_euclidean: 12.0377 - val_jacard_coef: 0.0772 Epoch 88/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1184 - mae_euclidean: 1.1584 - jacard_coef: 0.3500 - val_loss: 0.4292 - val_mae_euclidean: 17.9212 - val_jacard_coef: 0.0528 Epoch 89/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1204 - mae_euclidean: 1.2424 - jacard_coef: 0.3572 - val_loss: 0.3342 - val_mae_euclidean: 11.3533 - val_jacard_coef: 0.0860 Epoch 90/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1210 - mae_euclidean: 1.2879 - jacard_coef: 0.3446 - val_loss: 0.2918 - val_mae_euclidean: 10.4074 - val_jacard_coef: 0.0982 Epoch 91/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1195 - mae_euclidean: 1.1601 - jacard_coef: 0.3553 - val_loss: 0.4155 - val_mae_euclidean: 11.7489 - val_jacard_coef: 0.0784 Epoch 92/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1175 - mae_euclidean: 1.0857 - jacard_coef: 0.3653 - val_loss: 0.3199 - val_mae_euclidean: 9.5442 - val_jacard_coef: 0.1023 Epoch 93/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1205 - mae_euclidean: 
1.1450 - jacard_coef: 0.3416 - val_loss: 0.3770 - val_mae_euclidean: 10.9272 - val_jacard_coef: 0.0943 Epoch 94/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1191 - mae_euclidean: 1.1603 - jacard_coef: 0.3571 - val_loss: 0.3265 - val_mae_euclidean: 8.3780 - val_jacard_coef: 0.1219 Epoch 95/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1240 - mae_euclidean: 1.3945 - jacard_coef: 0.3415 - val_loss: 0.2916 - val_mae_euclidean: 8.8010 - val_jacard_coef: 0.1044 Epoch 96/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1185 - mae_euclidean: 1.1538 - jacard_coef: 0.3476 - val_loss: 0.3570 - val_mae_euclidean: 7.8159 - val_jacard_coef: 0.1229 Epoch 97/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1166 - mae_euclidean: 1.1078 - jacard_coef: 0.3658 - val_loss: 0.3150 - val_mae_euclidean: 7.4873 - val_jacard_coef: 0.1279 Epoch 98/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1174 - mae_euclidean: 1.2434 - jacard_coef: 0.3550 - val_loss: 0.2906 - val_mae_euclidean: 6.7495 - val_jacard_coef: 0.1539 Epoch 99/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1168 - mae_euclidean: 1.0306 - jacard_coef: 0.3614 - val_loss: 0.3051 - val_mae_euclidean: 6.7526 - val_jacard_coef: 0.1444 Epoch 100/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1149 - mae_euclidean: 1.1063 - jacard_coef: 0.3703 - val_loss: 0.2449 - val_mae_euclidean: 5.7634 - val_jacard_coef: 0.1725 Epoch 101/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1173 - mae_euclidean: 1.1071 - jacard_coef: 0.3558 - val_loss: 0.2976 - val_mae_euclidean: 6.7064 - val_jacard_coef: 0.1566 Epoch 102/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1146 - mae_euclidean: 1.0172 - jacard_coef: 0.3712 - val_loss: 0.2904 - val_mae_euclidean: 6.6576 - val_jacard_coef: 0.1585 Epoch 103/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1135 - 
mae_euclidean: 1.1629 - jacard_coef: 0.3664 - val_loss: 0.2411 - val_mae_euclidean: 5.5556 - val_jacard_coef: 0.2120 Epoch 104/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1144 - mae_euclidean: 1.0767 - jacard_coef: 0.3681 - val_loss: 0.2775 - val_mae_euclidean: 6.4200 - val_jacard_coef: 0.1735 Epoch 105/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1122 - mae_euclidean: 1.0358 - jacard_coef: 0.3745 - val_loss: 0.2394 - val_mae_euclidean: 5.1682 - val_jacard_coef: 0.2261 Epoch 106/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1130 - mae_euclidean: 1.0866 - jacard_coef: 0.3763 - val_loss: 0.2247 - val_mae_euclidean: 5.7362 - val_jacard_coef: 0.2084 Epoch 107/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1131 - mae_euclidean: 1.0414 - jacard_coef: 0.3718 - val_loss: 0.2147 - val_mae_euclidean: 4.9684 - val_jacard_coef: 0.2399 Epoch 108/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1139 - mae_euclidean: 1.0427 - jacard_coef: 0.3731 - val_loss: 0.2487 - val_mae_euclidean: 5.5923 - val_jacard_coef: 0.2237 Epoch 109/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1150 - mae_euclidean: 1.0623 - jacard_coef: 0.3697 - val_loss: 0.1770 - val_mae_euclidean: 3.8719 - val_jacard_coef: 0.2750 Epoch 110/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1106 - mae_euclidean: 1.0860 - jacard_coef: 0.3698 - val_loss: 0.1998 - val_mae_euclidean: 4.1074 - val_jacard_coef: 0.2627 Epoch 111/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1091 - mae_euclidean: 0.9676 - jacard_coef: 0.3857 - val_loss: 0.2509 - val_mae_euclidean: 5.1107 - val_jacard_coef: 0.2366 Epoch 112/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1107 - mae_euclidean: 1.1334 - jacard_coef: 0.3821 - val_loss: 0.2157 - val_mae_euclidean: 4.3528 - val_jacard_coef: 0.2523 Epoch 113/200 4/4 [==============================] - 2s 496ms/step - 
loss: 0.1090 - mae_euclidean: 1.0095 - jacard_coef: 0.3810 - val_loss: 0.2191 - val_mae_euclidean: 4.6859 - val_jacard_coef: 0.2465 Epoch 114/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1097 - mae_euclidean: 1.0491 - jacard_coef: 0.3836 - val_loss: 0.2197 - val_mae_euclidean: 4.3662 - val_jacard_coef: 0.2671 Epoch 115/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1079 - mae_euclidean: 0.9549 - jacard_coef: 0.3934 - val_loss: 0.2259 - val_mae_euclidean: 4.7127 - val_jacard_coef: 0.2434 Epoch 116/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1066 - mae_euclidean: 0.9700 - jacard_coef: 0.3930 - val_loss: 0.2095 - val_mae_euclidean: 3.8354 - val_jacard_coef: 0.2886 Epoch 117/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1078 - mae_euclidean: 0.9961 - jacard_coef: 0.3929 - val_loss: 0.2054 - val_mae_euclidean: 3.8933 - val_jacard_coef: 0.2794 Epoch 118/200 4/4 [==============================] - 2s 494ms/step - loss: 0.1041 - mae_euclidean: 0.9643 - jacard_coef: 0.4019 - val_loss: 0.2346 - val_mae_euclidean: 4.5963 - val_jacard_coef: 0.2681 Epoch 119/200 4/4 [==============================] - 2s 494ms/step - loss: 0.1062 - mae_euclidean: 0.9205 - jacard_coef: 0.4031 - val_loss: 0.2357 - val_mae_euclidean: 4.9118 - val_jacard_coef: 0.2544 Epoch 120/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1040 - mae_euclidean: 0.9568 - jacard_coef: 0.4056 - val_loss: 0.1926 - val_mae_euclidean: 3.6176 - val_jacard_coef: 0.3059 Epoch 121/200 4/4 [==============================] - 2s 496ms/step - loss: 0.1044 - mae_euclidean: 0.9520 - jacard_coef: 0.4051 - val_loss: 0.2040 - val_mae_euclidean: 3.9242 - val_jacard_coef: 0.2901 Epoch 122/200 4/4 [==============================] - 2s 494ms/step - loss: 0.1033 - mae_euclidean: 0.8500 - jacard_coef: 0.4051 - val_loss: 0.2482 - val_mae_euclidean: 4.8183 - val_jacard_coef: 0.2547 Epoch 123/200 4/4 [==============================] - 2s 
494ms/step - loss: 0.1066 - mae_euclidean: 1.0709 - jacard_coef: 0.4021 - val_loss: 0.1979 - val_mae_euclidean: 3.8512 - val_jacard_coef: 0.2852 Epoch 124/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1041 - mae_euclidean: 0.9735 - jacard_coef: 0.4024 - val_loss: 0.1981 - val_mae_euclidean: 3.5437 - val_jacard_coef: 0.2956 Epoch 125/200 4/4 [==============================] - 2s 495ms/step - loss: 0.1023 - mae_euclidean: 0.9186 - jacard_coef: 0.4121 - val_loss: 0.1776 - val_mae_euclidean: 3.1697 - val_jacard_coef: 0.3157 Epoch 126/200 4/4 [==============================] - 2s 494ms/step - loss: 0.1057 - mae_euclidean: 1.1270 - jacard_coef: 0.3933 - val_loss: 0.2116 - val_mae_euclidean: 4.0107 - val_jacard_coef: 0.2855 Epoch 127/200 4/4 [==============================] - 2s 493ms/step - loss: 0.1047 - mae_euclidean: 0.8701 - jacard_coef: 0.4092 - val_loss: 0.2024 - val_mae_euclidean: 3.5487 - val_jacard_coef: 0.3003 Epoch 128/200 4/4 [==============================] - 2s 492ms/step - loss: 0.1033 - mae_euclidean: 1.0535 - jacard_coef: 0.4087 - val_loss: 0.1599 - val_mae_euclidean: 2.8998 - val_jacard_coef: 0.3574 Epoch 129/200 4/4 [==============================] - 2s 493ms/step - loss: 0.1038 - mae_euclidean: 0.9602 - jacard_coef: 0.4115 - val_loss: 0.2356 - val_mae_euclidean: 4.2757 - val_jacard_coef: 0.2652 Epoch 130/200 4/4 [==============================] - 2s 491ms/step - loss: 0.1011 - mae_euclidean: 0.8665 - jacard_coef: 0.4160 - val_loss: 0.1744 - val_mae_euclidean: 3.2847 - val_jacard_coef: 0.3337 Epoch 131/200 4/4 [==============================] - 2s 493ms/step - loss: 0.1012 - mae_euclidean: 0.8721 - jacard_coef: 0.4160 - val_loss: 0.2209 - val_mae_euclidean: 3.9448 - val_jacard_coef: 0.2711 Epoch 132/200 4/4 [==============================] - 2s 493ms/step - loss: 0.1036 - mae_euclidean: 0.9349 - jacard_coef: 0.4116 - val_loss: 0.1816 - val_mae_euclidean: 2.9636 - val_jacard_coef: 0.3319 Epoch 133/200 4/4 
[==============================] - 2s 491ms/step - loss: 0.0998 - mae_euclidean: 0.8835 - jacard_coef: 0.4164 - val_loss: 0.1947 - val_mae_euclidean: 3.2206 - val_jacard_coef: 0.3220 Epoch 134/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0964 - mae_euclidean: 0.8091 - jacard_coef: 0.4271 - val_loss: 0.1882 - val_mae_euclidean: 3.1202 - val_jacard_coef: 0.3335 Epoch 135/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0959 - mae_euclidean: 0.7427 - jacard_coef: 0.4371 - val_loss: 0.1870 - val_mae_euclidean: 3.1744 - val_jacard_coef: 0.3328 Epoch 136/200 4/4 [==============================] - 2s 491ms/step - loss: 0.0955 - mae_euclidean: 0.8808 - jacard_coef: 0.4380 - val_loss: 0.1774 - val_mae_euclidean: 3.0453 - val_jacard_coef: 0.3511 Epoch 137/200 4/4 [==============================] - 2s 496ms/step - loss: 0.0942 - mae_euclidean: 0.7919 - jacard_coef: 0.4438 - val_loss: 0.2020 - val_mae_euclidean: 3.2806 - val_jacard_coef: 0.3282 Epoch 138/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0983 - mae_euclidean: 1.1193 - jacard_coef: 0.4301 - val_loss: 0.1728 - val_mae_euclidean: 2.9823 - val_jacard_coef: 0.3525 Epoch 139/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0933 - mae_euclidean: 0.7633 - jacard_coef: 0.4449 - val_loss: 0.2028 - val_mae_euclidean: 3.3662 - val_jacard_coef: 0.3205 Epoch 140/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0933 - mae_euclidean: 0.7365 - jacard_coef: 0.4485 - val_loss: 0.1571 - val_mae_euclidean: 2.7355 - val_jacard_coef: 0.3845 Epoch 141/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0967 - mae_euclidean: 1.2026 - jacard_coef: 0.4344 - val_loss: 0.1736 - val_mae_euclidean: 2.8638 - val_jacard_coef: 0.3508 Epoch 142/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0968 - mae_euclidean: 0.8426 - jacard_coef: 0.4381 - val_loss: 0.1844 - val_mae_euclidean: 3.0428 - val_jacard_coef: 0.3484 Epoch 
143/200 4/4 [==============================] - 2s 496ms/step - loss: 0.0931 - mae_euclidean: 0.8085 - jacard_coef: 0.4486 - val_loss: 0.1586 - val_mae_euclidean: 2.2117 - val_jacard_coef: 0.3904 Epoch 144/200 4/4 [==============================] - 2s 496ms/step - loss: 0.0935 - mae_euclidean: 0.8520 - jacard_coef: 0.4499 - val_loss: 0.1906 - val_mae_euclidean: 3.0314 - val_jacard_coef: 0.3398 Epoch 145/200 4/4 [==============================] - 2s 497ms/step - loss: 0.0917 - mae_euclidean: 0.7687 - jacard_coef: 0.4529 - val_loss: 0.1659 - val_mae_euclidean: 2.6672 - val_jacard_coef: 0.3904 Epoch 146/200 4/4 [==============================] - 2s 496ms/step - loss: 0.0906 - mae_euclidean: 0.7637 - jacard_coef: 0.4575 - val_loss: 0.1639 - val_mae_euclidean: 2.7941 - val_jacard_coef: 0.3950 Epoch 147/200 4/4 [==============================] - 2s 496ms/step - loss: 0.0903 - mae_euclidean: 0.7626 - jacard_coef: 0.4649 - val_loss: 0.1463 - val_mae_euclidean: 2.1850 - val_jacard_coef: 0.4220 Epoch 148/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0913 - mae_euclidean: 1.0175 - jacard_coef: 0.4590 - val_loss: 0.1550 - val_mae_euclidean: 2.1106 - val_jacard_coef: 0.3945 Epoch 149/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0948 - mae_euclidean: 0.8760 - jacard_coef: 0.4512 - val_loss: 0.1562 - val_mae_euclidean: 2.2698 - val_jacard_coef: 0.3905 Epoch 150/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0912 - mae_euclidean: 0.8699 - jacard_coef: 0.4553 - val_loss: 0.1262 - val_mae_euclidean: 1.2687 - val_jacard_coef: 0.4507 Epoch 151/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0888 - mae_euclidean: 0.7689 - jacard_coef: 0.4700 - val_loss: 0.1559 - val_mae_euclidean: 2.1265 - val_jacard_coef: 0.3990 Epoch 152/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0886 - mae_euclidean: 0.8342 - jacard_coef: 0.4681 - val_loss: 0.1721 - val_mae_euclidean: 2.8104 - val_jacard_coef: 
0.3899 Epoch 153/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0862 - mae_euclidean: 0.6932 - jacard_coef: 0.4774 - val_loss: 0.1278 - val_mae_euclidean: 1.3090 - val_jacard_coef: 0.4550 Epoch 154/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0864 - mae_euclidean: 0.7221 - jacard_coef: 0.4777 - val_loss: 0.1533 - val_mae_euclidean: 2.1849 - val_jacard_coef: 0.4097 Epoch 155/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0856 - mae_euclidean: 0.7242 - jacard_coef: 0.4838 - val_loss: 0.1646 - val_mae_euclidean: 2.6026 - val_jacard_coef: 0.4246 Epoch 156/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0861 - mae_euclidean: 0.6737 - jacard_coef: 0.4809 - val_loss: 0.1857 - val_mae_euclidean: 2.9316 - val_jacard_coef: 0.3799 Epoch 157/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0852 - mae_euclidean: 0.7138 - jacard_coef: 0.4788 - val_loss: 0.1426 - val_mae_euclidean: 1.4227 - val_jacard_coef: 0.4473 Epoch 158/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0893 - mae_euclidean: 0.8421 - jacard_coef: 0.4765 - val_loss: 0.1602 - val_mae_euclidean: 1.9300 - val_jacard_coef: 0.4036 Epoch 159/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0864 - mae_euclidean: 0.7349 - jacard_coef: 0.4807 - val_loss: 0.1518 - val_mae_euclidean: 2.0217 - val_jacard_coef: 0.4237 Epoch 160/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0815 - mae_euclidean: 0.5918 - jacard_coef: 0.4933 - val_loss: 0.1675 - val_mae_euclidean: 2.6831 - val_jacard_coef: 0.4203 Epoch 161/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0813 - mae_euclidean: 0.6209 - jacard_coef: 0.4975 - val_loss: 0.1567 - val_mae_euclidean: 1.8549 - val_jacard_coef: 0.4335 Epoch 162/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0793 - mae_euclidean: 0.7436 - jacard_coef: 0.5082 - val_loss: 0.1497 - val_mae_euclidean: 1.4273 - 
val_jacard_coef: 0.4494 Epoch 163/200 4/4 [==============================] - 2s 491ms/step - loss: 0.0801 - mae_euclidean: 0.7219 - jacard_coef: 0.5100 - val_loss: 0.1903 - val_mae_euclidean: 2.8102 - val_jacard_coef: 0.3901 Epoch 164/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0781 - mae_euclidean: 0.5588 - jacard_coef: 0.5153 - val_loss: 0.1428 - val_mae_euclidean: 1.2351 - val_jacard_coef: 0.4615 Epoch 165/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0795 - mae_euclidean: 0.6798 - jacard_coef: 0.5160 - val_loss: 0.1692 - val_mae_euclidean: 1.9808 - val_jacard_coef: 0.4206 Epoch 166/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0768 - mae_euclidean: 0.6211 - jacard_coef: 0.5203 - val_loss: 0.1387 - val_mae_euclidean: 1.3454 - val_jacard_coef: 0.4598 Epoch 167/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0774 - mae_euclidean: 0.7417 - jacard_coef: 0.5209 - val_loss: 0.1715 - val_mae_euclidean: 1.9014 - val_jacard_coef: 0.4210 Epoch 168/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0771 - mae_euclidean: 0.6565 - jacard_coef: 0.5221 - val_loss: 0.1470 - val_mae_euclidean: 1.3695 - val_jacard_coef: 0.4574 Epoch 169/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0775 - mae_euclidean: 0.6851 - jacard_coef: 0.5225 - val_loss: 0.1789 - val_mae_euclidean: 2.2386 - val_jacard_coef: 0.4149 Epoch 170/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0762 - mae_euclidean: 0.7423 - jacard_coef: 0.5266 - val_loss: 0.1476 - val_mae_euclidean: 1.2864 - val_jacard_coef: 0.4600 Epoch 171/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0737 - mae_euclidean: 0.6655 - jacard_coef: 0.5360 - val_loss: 0.1650 - val_mae_euclidean: 1.8278 - val_jacard_coef: 0.4401 Epoch 172/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0787 - mae_euclidean: 0.8843 - jacard_coef: 0.5237 - val_loss: 0.1489 - 
val_mae_euclidean: 1.3762 - val_jacard_coef: 0.4641 Epoch 173/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0726 - mae_euclidean: 0.8047 - jacard_coef: 0.5434 - val_loss: 0.1349 - val_mae_euclidean: 1.0559 - val_jacard_coef: 0.4794 Epoch 174/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0732 - mae_euclidean: 0.5377 - jacard_coef: 0.5449 - val_loss: 0.1514 - val_mae_euclidean: 1.6726 - val_jacard_coef: 0.4485 Epoch 175/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0722 - mae_euclidean: 0.5893 - jacard_coef: 0.5450 - val_loss: 0.1476 - val_mae_euclidean: 1.3248 - val_jacard_coef: 0.4572 Epoch 176/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0729 - mae_euclidean: 0.7713 - jacard_coef: 0.5423 - val_loss: 0.1691 - val_mae_euclidean: 1.4307 - val_jacard_coef: 0.4321 Epoch 177/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0717 - mae_euclidean: 0.6697 - jacard_coef: 0.5471 - val_loss: 0.1536 - val_mae_euclidean: 1.3854 - val_jacard_coef: 0.4609 Epoch 178/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0692 - mae_euclidean: 0.4895 - jacard_coef: 0.5526 - val_loss: 0.1490 - val_mae_euclidean: 1.3071 - val_jacard_coef: 0.4672 Epoch 179/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0698 - mae_euclidean: 0.5738 - jacard_coef: 0.5605 - val_loss: 0.1597 - val_mae_euclidean: 1.3696 - val_jacard_coef: 0.4620 Epoch 180/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0670 - mae_euclidean: 0.5513 - jacard_coef: 0.5702 - val_loss: 0.1500 - val_mae_euclidean: 1.3538 - val_jacard_coef: 0.4783 Epoch 181/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0669 - mae_euclidean: 0.4366 - jacard_coef: 0.5712 - val_loss: 0.1704 - val_mae_euclidean: 1.3771 - val_jacard_coef: 0.4434 Epoch 182/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0668 - mae_euclidean: 0.5525 - jacard_coef: 0.5719 - 
val_loss: 0.1666 - val_mae_euclidean: 1.3508 - val_jacard_coef: 0.4642 Epoch 183/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0654 - mae_euclidean: 0.5104 - jacard_coef: 0.5774 - val_loss: 0.1620 - val_mae_euclidean: 1.4071 - val_jacard_coef: 0.4648 Epoch 184/200 4/4 [==============================] - 2s 491ms/step - loss: 0.0638 - mae_euclidean: 0.5257 - jacard_coef: 0.5837 - val_loss: 0.1693 - val_mae_euclidean: 1.7664 - val_jacard_coef: 0.4555 Epoch 185/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0632 - mae_euclidean: 0.4176 - jacard_coef: 0.5900 - val_loss: 0.1572 - val_mae_euclidean: 1.4163 - val_jacard_coef: 0.4690 Epoch 186/200 4/4 [==============================] - 2s 495ms/step - loss: 0.0622 - mae_euclidean: 0.3982 - jacard_coef: 0.5937 - val_loss: 0.1486 - val_mae_euclidean: 1.2644 - val_jacard_coef: 0.4771 Epoch 187/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0650 - mae_euclidean: 0.5350 - jacard_coef: 0.5852 - val_loss: 0.1592 - val_mae_euclidean: 1.2668 - val_jacard_coef: 0.4654 Epoch 188/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0639 - mae_euclidean: 0.4657 - jacard_coef: 0.5911 - val_loss: 0.1580 - val_mae_euclidean: 1.4522 - val_jacard_coef: 0.4576 Epoch 189/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0627 - mae_euclidean: 0.4975 - jacard_coef: 0.5915 - val_loss: 0.1612 - val_mae_euclidean: 1.3866 - val_jacard_coef: 0.4664 Epoch 190/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0602 - mae_euclidean: 0.3724 - jacard_coef: 0.6046 - val_loss: 0.1681 - val_mae_euclidean: 1.3677 - val_jacard_coef: 0.4601 Epoch 191/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0603 - mae_euclidean: 0.5073 - jacard_coef: 0.6044 - val_loss: 0.1590 - val_mae_euclidean: 1.3443 - val_jacard_coef: 0.4758 Epoch 192/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0605 - mae_euclidean: 0.4058 - 
jacard_coef: 0.6072 - val_loss: 0.1684 - val_mae_euclidean: 1.3877 - val_jacard_coef: 0.4607 Epoch 193/200 4/4 [==============================] - 2s 493ms/step - loss: 0.0601 - mae_euclidean: 0.4223 - jacard_coef: 0.6087 - val_loss: 0.1710 - val_mae_euclidean: 1.6969 - val_jacard_coef: 0.4711 Epoch 194/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0603 - mae_euclidean: 0.5903 - jacard_coef: 0.6048 - val_loss: 0.1562 - val_mae_euclidean: 1.2121 - val_jacard_coef: 0.4814 Epoch 195/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0649 - mae_euclidean: 0.7150 - jacard_coef: 0.5894 - val_loss: 0.1764 - val_mae_euclidean: 1.7122 - val_jacard_coef: 0.4658 Epoch 196/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0649 - mae_euclidean: 0.6689 - jacard_coef: 0.5954 - val_loss: 0.1592 - val_mae_euclidean: 1.1036 - val_jacard_coef: 0.4696 Epoch 197/200 4/4 [==============================] - 2s 492ms/step - loss: 0.0674 - mae_euclidean: 0.6684 - jacard_coef: 0.5772 - val_loss: 0.1819 - val_mae_euclidean: 1.5406 - val_jacard_coef: 0.4522 Epoch 198/200 4/4 [==============================] - 2s 491ms/step - loss: 0.0656 - mae_euclidean: 0.7373 - jacard_coef: 0.5861 - val_loss: 0.1544 - val_mae_euclidean: 1.2252 - val_jacard_coef: 0.4813 Epoch 199/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0620 - mae_euclidean: 0.4680 - jacard_coef: 0.5973 - val_loss: 0.1540 - val_mae_euclidean: 1.2381 - val_jacard_coef: 0.4800 Epoch 200/200 4/4 [==============================] - 2s 494ms/step - loss: 0.0615 - mae_euclidean: 0.6253 - jacard_coef: 0.5991 - val_loss: 0.1613 - val_mae_euclidean: 1.2337 - val_jacard_coef: 0.4869 Model trained for 395.1184892654419s
plot_history(history_sa_unet3, "SA Unet3")
plot_train_val(model_sa_unet3, 1, 1)
The binary cross entropy run produced a similar MAE euclidean and a smaller Jaccard loss. The loss is approaching 0, leaving little room for further improvement, and the risk of overfitting with binary cross entropy is higher.
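The `dice_coef_loss` used throughout this notebook is defined in an earlier cell; for reference, a minimal NumPy sketch of a standard smoothed soft-Dice loss (the smoothing constant here is an assumption, not necessarily the value used above):

```python
import numpy as np

def dice_coef(y_true, y_pred, smooth=1.0):
    """Soft Dice coefficient: 2*|A∩B| / (|A| + |B|), smoothed to avoid 0/0."""
    y_true = np.asarray(y_true, dtype=np.float64).ravel()
    y_pred = np.asarray(y_pred, dtype=np.float64).ravel()
    intersection = np.sum(y_true * y_pred)
    return (2.0 * intersection + smooth) / (np.sum(y_true) + np.sum(y_pred) + smooth)

def dice_coef_loss(y_true, y_pred):
    # A perfect prediction gives loss 0; a complete mismatch approaches 1.
    return 1.0 - dice_coef(y_true, y_pred)
```

Unlike binary cross entropy, the Dice loss directly trades off overlap between the predicted and true border masks, which makes it less sensitive to the strong class imbalance of thin grain borders.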
# finding the optimal Adam settings
def model_builder_SA_UNet_3(hp):
    model = SA_UNet(input_shape, block_size=25, keep_prob=0.8, start_neurons=20)
    hp_learning_rate = hp.Float('learning_rate', min_value=0.001, max_value=0.01, step=(0.01 - 0.001))
    hp_beta_1 = hp.Float('beta_1', min_value=0.85, max_value=0.95, step=0.01)
    hp_beta_2 = hp.Float('beta_2', min_value=0.9, max_value=0.999, step=0.001)
    model.compile(optimizer=Adam(learning_rate=hp_learning_rate, beta_1=hp_beta_1, beta_2=hp_beta_2),
                  loss=dice_coef_loss,
                  metrics=[mae_euclidean, jacard_coef])
    return model
tuner3 = kt.BayesianOptimization(model_builder_SA_UNet_3,
                                 objective=kt.Objective("val_mae_euclidean", direction='min'),
                                 max_trials=10,
                                 seed=14,
                                 directory=os.path.normpath('C:/keras_tuner'),
                                 project_name='2022-02-16_SA_UNet_tune3')
INFO:tensorflow:Reloading Oracle from existing project C:\keras_tuner\2022-02-16_SA_UNet_tune3\oracle.json INFO:tensorflow:Reloading Tuner from C:\keras_tuner\2022-02-16_SA_UNet_tune3\tuner0.json
tf.random.set_seed(14)
tuner3.search(X_train, y_train,
              validation_data=(X_val, y_val),
              batch_size=8,
              epochs=200)
INFO:tensorflow:Oracle triggered exit
tuner3.results_summary(num_trials=10)
Results summary
Results in C:\keras_tuner\2022-02-16_SA_UNet_tune3
Showing 10 best trials, Objective(name='val_mae_euclidean', direction='min')

learning_rate   beta_1   beta_2   Score
0.01            0.85     0.919    0.8950
0.01            0.85     0.900    0.8991
0.01            0.85     0.953    0.9050
0.01            0.85     0.914    0.9506
0.01            0.85     0.937    0.9785
0.01            0.87     0.967    1.0058
0.01            0.85     0.999    1.0201
0.01            0.92     0.900    1.0354
0.001           0.85     0.900    1.0911
0.001           0.95     0.936    1.1310
The best parameters differ from those already in use, but the benefit is small to negligible. All top trials use beta_1 = 0.85, which sits on the edge of the search space; a lower beta_1 might be beneficial, but this is not explored in this work. Let's train with the best settings and look at the curves.
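Note that choosing `step=(0.01-0.001)` quantizes the learning-rate range to just its two endpoints, which is why every trial above reports either 0.001 or 0.01 (the latter as 0.010000000000000002 due to floating-point stepping). A small illustrative helper of my own, approximating keras-tuner's quantization of a stepped `hp.Float`:

```python
def float_grid(min_value, max_value, step):
    """Candidate values a stepped hp.Float can take: min_value, min_value + step, ...
    (approximation of keras-tuner's quantization behaviour)."""
    values, v = [], min_value
    while v <= max_value + 1e-12:   # small tolerance for float accumulation
        values.append(round(v, 10))
        v += step
    return values

# learning_rate with step = 0.01 - 0.001 collapses to just the two endpoints:
print(float_grid(0.001, 0.01, 0.01 - 0.001))   # [0.001, 0.01]
```

With only two learning-rate candidates, the tuner effectively spends its budget on beta_1 and beta_2; a log-sampled learning rate would cover the range more evenly.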
# SA Unet4
# changing the Adam parameters: learning_rate=1e-2, beta_1=0.85, beta_2=0.919
tf.random.set_seed(14)
model_sa_unet4 = SA_UNet(input_shape, block_size=25, keep_prob=0.8, start_neurons=20)
model_sa_unet4.compile(optimizer=Adam(learning_rate=1e-2, beta_1=0.85, beta_2=0.919),
                       loss=dice_coef_loss,
                       metrics=[mae_euclidean, jacard_coef])
start = time.time()
history_sa_unet4 = model_sa_unet4.fit(X_train, y_train,
                                      validation_data=(X_val, y_val),
                                      batch_size=8,  # no resources for 16
                                      epochs=200)
print(f"Model trained for {time.time() - start}s")
model_sa_unet4.save("2022-02-17 SA-UNet4 200epochs.hdf5")
Training log (SA Unet4, epochs 1–161, condensed): training loss fell steadily (0.8377 at epoch 1 to ≈0.22 by epoch 160, jacard_coef ≈0.64), but the validation metrics diverged almost immediately. From epoch 3 onward val_loss was pinned at ≈1.0, val_mae_euclidean overflowed to ≈1.8e19 and val_jacard_coef collapsed below 1e-4. Validation only began to recover around epoch 67 and improved steadily from epoch ≈95 onward, reaching val_loss 0.3057, val_mae_euclidean 1.4581 and val_jacard_coef 0.5318 at epoch 150, with similar values through epoch 161.
0.4699 Epoch 162/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2193 - mae_euclidean: 0.8323 - jacard_coef: 0.6405 - val_loss: 0.3475 - val_mae_euclidean: 1.3706 - val_jacard_coef: 0.4842 Epoch 163/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2190 - mae_euclidean: 0.8373 - jacard_coef: 0.6407 - val_loss: 0.3888 - val_mae_euclidean: 2.5125 - val_jacard_coef: 0.4401 Epoch 164/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2191 - mae_euclidean: 0.7207 - jacard_coef: 0.6406 - val_loss: 0.3541 - val_mae_euclidean: 1.6162 - val_jacard_coef: 0.4770 Epoch 165/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2225 - mae_euclidean: 0.7625 - jacard_coef: 0.6361 - val_loss: 0.3484 - val_mae_euclidean: 1.5998 - val_jacard_coef: 0.4833 Epoch 166/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2195 - mae_euclidean: 0.8200 - jacard_coef: 0.6401 - val_loss: 0.3270 - val_mae_euclidean: 1.2051 - val_jacard_coef: 0.5072 Epoch 167/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2224 - mae_euclidean: 0.7548 - jacard_coef: 0.6363 - val_loss: 0.3820 - val_mae_euclidean: 1.8025 - val_jacard_coef: 0.4472 Epoch 168/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2229 - mae_euclidean: 0.9876 - jacard_coef: 0.6357 - val_loss: 0.3357 - val_mae_euclidean: 1.4488 - val_jacard_coef: 0.4973 Epoch 169/200 4/4 [==============================] - 2s 497ms/step - loss: 0.2185 - mae_euclidean: 0.6977 - jacard_coef: 0.6414 - val_loss: 0.3557 - val_mae_euclidean: 1.6661 - val_jacard_coef: 0.4753 Epoch 170/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2170 - mae_euclidean: 0.7652 - jacard_coef: 0.6434 - val_loss: 0.3169 - val_mae_euclidean: 1.1319 - val_jacard_coef: 0.5187 Epoch 171/200 4/4 [==============================] - 2s 497ms/step - loss: 0.2178 - mae_euclidean: 0.8830 - jacard_coef: 0.6424 - val_loss: 0.3233 - val_mae_euclidean: 1.1845 - 
val_jacard_coef: 0.5114 Epoch 172/200 4/4 [==============================] - 2s 497ms/step - loss: 0.2268 - mae_euclidean: 0.8841 - jacard_coef: 0.6304 - val_loss: 0.3164 - val_mae_euclidean: 1.6001 - val_jacard_coef: 0.5193 Epoch 173/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2136 - mae_euclidean: 0.8197 - jacard_coef: 0.6480 - val_loss: 0.3020 - val_mae_euclidean: 1.0686 - val_jacard_coef: 0.5361 Epoch 174/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2119 - mae_euclidean: 0.6465 - jacard_coef: 0.6504 - val_loss: 0.3141 - val_mae_euclidean: 1.2279 - val_jacard_coef: 0.5220 Epoch 175/200 4/4 [==============================] - 2s 495ms/step - loss: 0.2101 - mae_euclidean: 0.7610 - jacard_coef: 0.6529 - val_loss: 0.3173 - val_mae_euclidean: 1.4909 - val_jacard_coef: 0.5182 Epoch 176/200 4/4 [==============================] - 2s 495ms/step - loss: 0.2212 - mae_euclidean: 0.9298 - jacard_coef: 0.6383 - val_loss: 0.3275 - val_mae_euclidean: 1.4328 - val_jacard_coef: 0.5065 Epoch 177/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2195 - mae_euclidean: 0.7265 - jacard_coef: 0.6401 - val_loss: 0.3129 - val_mae_euclidean: 1.1077 - val_jacard_coef: 0.5234 Epoch 178/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2152 - mae_euclidean: 1.0275 - jacard_coef: 0.6459 - val_loss: 0.3369 - val_mae_euclidean: 1.5899 - val_jacard_coef: 0.4960 Epoch 179/200 4/4 [==============================] - 2s 495ms/step - loss: 0.2118 - mae_euclidean: 0.6256 - jacard_coef: 0.6506 - val_loss: 0.3244 - val_mae_euclidean: 1.2252 - val_jacard_coef: 0.5101 Epoch 180/200 4/4 [==============================] - 2s 495ms/step - loss: 0.2081 - mae_euclidean: 0.6659 - jacard_coef: 0.6557 - val_loss: 0.3260 - val_mae_euclidean: 1.2070 - val_jacard_coef: 0.5083 Epoch 181/200 4/4 [==============================] - 2s 495ms/step - loss: 0.2106 - mae_euclidean: 0.7603 - jacard_coef: 0.6523 - val_loss: 0.3543 - 
val_mae_euclidean: 1.4435 - val_jacard_coef: 0.4768 Epoch 182/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2128 - mae_euclidean: 0.7870 - jacard_coef: 0.6493 - val_loss: 0.3213 - val_mae_euclidean: 1.4152 - val_jacard_coef: 0.5137 Epoch 183/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2111 - mae_euclidean: 0.7520 - jacard_coef: 0.6516 - val_loss: 0.3184 - val_mae_euclidean: 1.2593 - val_jacard_coef: 0.5170 Epoch 184/200 4/4 [==============================] - 2s 495ms/step - loss: 0.2044 - mae_euclidean: 0.6469 - jacard_coef: 0.6608 - val_loss: 0.3205 - val_mae_euclidean: 1.2920 - val_jacard_coef: 0.5146 Epoch 185/200 4/4 [==============================] - 2s 494ms/step - loss: 0.2001 - mae_euclidean: 0.6417 - jacard_coef: 0.6666 - val_loss: 0.3036 - val_mae_euclidean: 1.1428 - val_jacard_coef: 0.5342 Epoch 186/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2047 - mae_euclidean: 0.7207 - jacard_coef: 0.6602 - val_loss: 0.3043 - val_mae_euclidean: 1.0703 - val_jacard_coef: 0.5334 Epoch 187/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2076 - mae_euclidean: 0.8643 - jacard_coef: 0.6562 - val_loss: 0.3168 - val_mae_euclidean: 1.5778 - val_jacard_coef: 0.5188 Epoch 188/200 4/4 [==============================] - 2s 496ms/step - loss: 0.2076 - mae_euclidean: 0.7843 - jacard_coef: 0.6563 - val_loss: 0.3134 - val_mae_euclidean: 1.0872 - val_jacard_coef: 0.5227 Epoch 189/200 4/4 [==============================] - 2s 497ms/step - loss: 0.2041 - mae_euclidean: 0.7682 - jacard_coef: 0.6611 - val_loss: 0.3294 - val_mae_euclidean: 1.5466 - val_jacard_coef: 0.5045 Epoch 190/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2012 - mae_euclidean: 0.6552 - jacard_coef: 0.6651 - val_loss: 0.3236 - val_mae_euclidean: 1.2748 - val_jacard_coef: 0.5111 Epoch 191/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1980 - mae_euclidean: 0.6610 - jacard_coef: 0.6695 - 
val_loss: 0.3285 - val_mae_euclidean: 1.8237 - val_jacard_coef: 0.5054 Epoch 192/200 4/4 [==============================] - 2s 497ms/step - loss: 0.2021 - mae_euclidean: 0.7040 - jacard_coef: 0.6638 - val_loss: 0.3007 - val_mae_euclidean: 0.9230 - val_jacard_coef: 0.5377 Epoch 193/200 4/4 [==============================] - 2s 497ms/step - loss: 0.1982 - mae_euclidean: 0.6551 - jacard_coef: 0.6693 - val_loss: 0.3215 - val_mae_euclidean: 1.8457 - val_jacard_coef: 0.5135 Epoch 194/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2041 - mae_euclidean: 0.9034 - jacard_coef: 0.6612 - val_loss: 0.3365 - val_mae_euclidean: 1.3517 - val_jacard_coef: 0.4964 Epoch 195/200 4/4 [==============================] - 2s 500ms/step - loss: 0.2215 - mae_euclidean: 1.0510 - jacard_coef: 0.6376 - val_loss: 0.3180 - val_mae_euclidean: 1.0438 - val_jacard_coef: 0.5174 Epoch 196/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2222 - mae_euclidean: 1.1582 - jacard_coef: 0.6367 - val_loss: 0.3129 - val_mae_euclidean: 1.2946 - val_jacard_coef: 0.5234 Epoch 197/200 4/4 [==============================] - 2s 498ms/step - loss: 0.2177 - mae_euclidean: 0.9232 - jacard_coef: 0.6426 - val_loss: 0.3541 - val_mae_euclidean: 1.7537 - val_jacard_coef: 0.4771 Epoch 198/200 4/4 [==============================] - 2s 499ms/step - loss: 0.2095 - mae_euclidean: 0.9016 - jacard_coef: 0.6538 - val_loss: 0.3921 - val_mae_euclidean: 3.0633 - val_jacard_coef: 0.4367 Epoch 199/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1990 - mae_euclidean: 0.6883 - jacard_coef: 0.6681 - val_loss: 0.3399 - val_mae_euclidean: 1.8824 - val_jacard_coef: 0.4926 Epoch 200/200 4/4 [==============================] - 2s 498ms/step - loss: 0.1982 - mae_euclidean: 0.7672 - jacard_coef: 0.6692 - val_loss: 0.3216 - val_mae_euclidean: 1.0686 - val_jacard_coef: 0.5134 Model trained for 395.8336317539215s
plot_history(history_sa_unet4, "SA Unet4")
Comparing the training curves of model_sa_unet4 and model_sa_unet2 indicates the effect of changing beta_1 and beta_2. All differences are within the noise of the curves, so the default beta_1 and beta_2 are preferred.
The model model_sa_unet1 is accepted as the final model. Motivation:
Key implementation details:
The trained model can be loaded from the file "2022-02-17 SA-UNet1 200epochs.hdf5". The custom objects (the loss, the metrics, and the DropBlock2D layer) have to be specified during loading.
# best_model = model_sa_unet1
best_model = tf.keras.models.load_model('2022-02-17 SA-UNet1 200epochs.hdf5',
custom_objects={'dice_coef_loss': dice_coef_loss,
'jacard_coef': jacard_coef,
'mae_euclidean': mae_euclidean,
'DropBlock2D' : DropBlock2D}
)
best_model.summary()
Model: "SA_UNet"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 176, 256, 1 0 []
)]
conv2d (Conv2D) (None, 176, 256, 20 200 ['input_1[0][0]']
)
drop_block2d (DropBlock2D) (None, 176, 256, 20 0 ['conv2d[0][0]']
)
batch_normalization (BatchNorm (None, 176, 256, 20 80 ['drop_block2d[0][0]']
alization) )
activation (Activation) (None, 176, 256, 20 0 ['batch_normalization[0][0]']
)
conv2d_1 (Conv2D) (None, 176, 256, 20 3620 ['activation[0][0]']
)
drop_block2d_1 (DropBlock2D) (None, 176, 256, 20 0 ['conv2d_1[0][0]']
)
batch_normalization_1 (BatchNo (None, 176, 256, 20 80 ['drop_block2d_1[0][0]']
rmalization) )
activation_1 (Activation) (None, 176, 256, 20 0 ['batch_normalization_1[0][0]']
)
max_pooling2d (MaxPooling2D) (None, 88, 128, 20) 0 ['activation_1[0][0]']
conv2d_2 (Conv2D) (None, 88, 128, 40) 7240 ['max_pooling2d[0][0]']
drop_block2d_2 (DropBlock2D) (None, 88, 128, 40) 0 ['conv2d_2[0][0]']
batch_normalization_2 (BatchNo (None, 88, 128, 40) 160 ['drop_block2d_2[0][0]']
rmalization)
activation_2 (Activation) (None, 88, 128, 40) 0 ['batch_normalization_2[0][0]']
conv2d_3 (Conv2D) (None, 88, 128, 40) 14440 ['activation_2[0][0]']
drop_block2d_3 (DropBlock2D) (None, 88, 128, 40) 0 ['conv2d_3[0][0]']
batch_normalization_3 (BatchNo (None, 88, 128, 40) 160 ['drop_block2d_3[0][0]']
rmalization)
activation_3 (Activation) (None, 88, 128, 40) 0 ['batch_normalization_3[0][0]']
max_pooling2d_1 (MaxPooling2D) (None, 44, 64, 40) 0 ['activation_3[0][0]']
conv2d_4 (Conv2D) (None, 44, 64, 80) 28880 ['max_pooling2d_1[0][0]']
drop_block2d_4 (DropBlock2D) (None, 44, 64, 80) 0 ['conv2d_4[0][0]']
batch_normalization_4 (BatchNo (None, 44, 64, 80) 320 ['drop_block2d_4[0][0]']
rmalization)
activation_4 (Activation) (None, 44, 64, 80) 0 ['batch_normalization_4[0][0]']
conv2d_5 (Conv2D) (None, 44, 64, 80) 57680 ['activation_4[0][0]']
drop_block2d_5 (DropBlock2D) (None, 44, 64, 80) 0 ['conv2d_5[0][0]']
batch_normalization_5 (BatchNo (None, 44, 64, 80) 320 ['drop_block2d_5[0][0]']
rmalization)
activation_5 (Activation) (None, 44, 64, 80) 0 ['batch_normalization_5[0][0]']
max_pooling2d_2 (MaxPooling2D) (None, 22, 32, 80) 0 ['activation_5[0][0]']
conv2d_6 (Conv2D) (None, 22, 32, 160) 115360 ['max_pooling2d_2[0][0]']
drop_block2d_6 (DropBlock2D) (None, 22, 32, 160) 0 ['conv2d_6[0][0]']
batch_normalization_6 (BatchNo (None, 22, 32, 160) 640 ['drop_block2d_6[0][0]']
rmalization)
activation_6 (Activation) (None, 22, 32, 160) 0 ['batch_normalization_6[0][0]']
lambda (Lambda) (None, 22, 32, 1) 0 ['activation_6[0][0]']
lambda_1 (Lambda) (None, 22, 32, 1) 0 ['activation_6[0][0]']
concatenate (Concatenate) (None, 22, 32, 2) 0 ['lambda[0][0]',
'lambda_1[0][0]']
conv2d_7 (Conv2D) (None, 22, 32, 1) 98 ['concatenate[0][0]']
multiply (Multiply) (None, 22, 32, 160) 0 ['activation_6[0][0]',
'conv2d_7[0][0]']
conv2d_8 (Conv2D) (None, 22, 32, 160) 230560 ['multiply[0][0]']
drop_block2d_7 (DropBlock2D) (None, 22, 32, 160) 0 ['conv2d_8[0][0]']
batch_normalization_7 (BatchNo (None, 22, 32, 160) 640 ['drop_block2d_7[0][0]']
rmalization)
activation_7 (Activation) (None, 22, 32, 160) 0 ['batch_normalization_7[0][0]']
conv2d_transpose (Conv2DTransp (None, 44, 64, 80) 115280 ['activation_7[0][0]']
ose)
concatenate_1 (Concatenate) (None, 44, 64, 160) 0 ['conv2d_transpose[0][0]',
'activation_5[0][0]']
conv2d_9 (Conv2D) (None, 44, 64, 80) 115280 ['concatenate_1[0][0]']
drop_block2d_8 (DropBlock2D) (None, 44, 64, 80) 0 ['conv2d_9[0][0]']
batch_normalization_8 (BatchNo (None, 44, 64, 80) 320 ['drop_block2d_8[0][0]']
rmalization)
activation_8 (Activation) (None, 44, 64, 80) 0 ['batch_normalization_8[0][0]']
conv2d_10 (Conv2D) (None, 44, 64, 80) 57680 ['activation_8[0][0]']
drop_block2d_9 (DropBlock2D) (None, 44, 64, 80) 0 ['conv2d_10[0][0]']
batch_normalization_9 (BatchNo (None, 44, 64, 80) 320 ['drop_block2d_9[0][0]']
rmalization)
activation_9 (Activation) (None, 44, 64, 80) 0 ['batch_normalization_9[0][0]']
conv2d_transpose_1 (Conv2DTran (None, 88, 128, 40) 28840 ['activation_9[0][0]']
spose)
concatenate_2 (Concatenate) (None, 88, 128, 80) 0 ['conv2d_transpose_1[0][0]',
'activation_3[0][0]']
conv2d_11 (Conv2D) (None, 88, 128, 40) 28840 ['concatenate_2[0][0]']
drop_block2d_10 (DropBlock2D) (None, 88, 128, 40) 0 ['conv2d_11[0][0]']
batch_normalization_10 (BatchN (None, 88, 128, 40) 160 ['drop_block2d_10[0][0]']
ormalization)
activation_10 (Activation) (None, 88, 128, 40) 0 ['batch_normalization_10[0][0]']
conv2d_12 (Conv2D) (None, 88, 128, 40) 14440 ['activation_10[0][0]']
drop_block2d_11 (DropBlock2D) (None, 88, 128, 40) 0 ['conv2d_12[0][0]']
batch_normalization_11 (BatchN (None, 88, 128, 40) 160 ['drop_block2d_11[0][0]']
ormalization)
activation_11 (Activation) (None, 88, 128, 40) 0 ['batch_normalization_11[0][0]']
conv2d_transpose_2 (Conv2DTran (None, 176, 256, 20 7220 ['activation_11[0][0]']
spose) )
concatenate_3 (Concatenate) (None, 176, 256, 40 0 ['conv2d_transpose_2[0][0]',
) 'activation_1[0][0]']
conv2d_13 (Conv2D) (None, 176, 256, 20 7220 ['concatenate_3[0][0]']
)
drop_block2d_12 (DropBlock2D) (None, 176, 256, 20 0 ['conv2d_13[0][0]']
)
batch_normalization_12 (BatchN (None, 176, 256, 20 80 ['drop_block2d_12[0][0]']
ormalization) )
activation_12 (Activation) (None, 176, 256, 20 0 ['batch_normalization_12[0][0]']
)
conv2d_14 (Conv2D) (None, 176, 256, 20 3620 ['activation_12[0][0]']
)
drop_block2d_13 (DropBlock2D) (None, 176, 256, 20 0 ['conv2d_14[0][0]']
)
batch_normalization_13 (BatchN (None, 176, 256, 20 80 ['drop_block2d_13[0][0]']
ormalization) )
activation_13 (Activation) (None, 176, 256, 20 0 ['batch_normalization_13[0][0]']
)
conv2d_15 (Conv2D) (None, 176, 256, 1) 21 ['activation_13[0][0]']
activation_14 (Activation) (None, 176, 256, 1) 0 ['conv2d_15[0][0]']
==================================================================================================
Total params: 840,039
Trainable params: 838,279
Non-trainable params: 1,760
__________________________________________________________________________________________________
# photos with 500x scale
plot_train_val(best_model, 1, 1)
# photos with 1000x scale
plot_train_val(best_model, 16, 5)
# plotting the histogram of a validation picture
idx = 1
pred = best_model.predict(np.expand_dims(X_val[idx], 0))
pred = np.squeeze(pred)
gt_pixels = y_val[idx]
gt_pixels = np.squeeze(gt_pixels)
border = pred[gt_pixels == 1]
grain = pred[gt_pixels == 0]
plt.hist(border, bins=20, density=False, label='true borders', histtype='step')
plt.hist(grain, bins=20, density=False, label='true grains', histtype='step')
plt.title("histogram of border probability")
plt.ylabel("number of pixels in bin")
plt.yscale('log')
plt.grid()
plt.legend()
plt.show()
The histograms show good separation between border and grain pixels. The chosen threshold of 0.5 is reasonable; moderate changes of the threshold will not yield very different results. Looking at the histogram alone, one could conclude that the classifier performs quite poorly. However, once the spatial tolerance visible in the picture above is taken into account, the results look quite promising. FP and FN border pixels often run in parallel; the shift could be due to errors in the ground truth. Some FP are actually questionable even for a human, and some people would classify them as borders.
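The claim that the threshold is not critical can be checked numerically by sweeping it and recording the resulting Jaccard score. A minimal sketch (the `pred`/`gt` arrays below are synthetic stand-ins for the `pred` and `gt_pixels` arrays computed above):

```python
import numpy as np

def jaccard_at_threshold(pred, gt, thresh):
    """Jaccard (IoU) of the binarized prediction against the ground truth."""
    mask = pred > thresh
    gt = np.asarray(gt).astype(bool)
    intersection = np.logical_and(mask, gt).sum()
    union = np.logical_or(mask, gt).sum()
    return intersection / union if union else 1.0

# synthetic stand-ins for the real `pred` probabilities and `gt_pixels` mask
rng = np.random.default_rng(0)
gt = rng.random((176, 256)) < 0.2
pred = np.clip(gt * 0.8 + rng.normal(0.1, 0.15, gt.shape), 0, 1)

for t in (0.3, 0.4, 0.5, 0.6, 0.7):
    print(f"thresh={t:.1f}  jaccard={jaccard_at_threshold(pred, gt, t):.3f}")
```

If the histogram separation is good, the printed scores form a flat plateau around 0.5, which is exactly the robustness argued above.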
# view the distance maps
idx = 1
y_pred = best_model.predict(np.expand_dims(X_val[idx], 0))[0]
y_true = y_val[idx]
y_true_inv = (y_true == 0)
y_true_inv = tf.cast(y_true_inv, tf.uint8)
y_true_distance_map = tfa.image.euclidean_dist_transform(y_true_inv)
y_pred_inv = (y_pred < 0.5)
y_pred_inv = tf.cast(y_pred_inv, tf.uint8)
y_pred_distance_map = tfa.image.euclidean_dist_transform(y_pred_inv)
error_euc_distances = tf.math.abs(tf.math.subtract(y_pred_distance_map, y_true_distance_map))
error_euc_distances = tf.squeeze(error_euc_distances).numpy()
plt.figure(figsize=(16, 7))
plt.subplot(1, 2, 1)
im = plt.imshow(y_true_distance_map, cmap='jet', vmax=40)
plt.colorbar(im, shrink=0.8)
plt.title("True euclidean distances")
plt.subplot(1, 2, 2)
im = plt.imshow(y_pred_distance_map, cmap='jet', vmax=40)
plt.colorbar(im, shrink=0.8)
plt.title("Predicted euclidean distances")
plt.show()
plt.figure(figsize=(12, 7))
im = plt.imshow(error_euc_distances, cmap='jet')
plt.colorbar(im)
plt.title("Absolute error in euclidean distances")
plt.show()
In the picture above, the zones with the biggest error appear in red.
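The scalar `mae_euclidean` metric reported during training follows the same recipe as this cell. An equivalent NumPy/SciPy sketch (assuming binary masks and the same distance-to-nearest-border convention as the tfa.image.euclidean_dist_transform calls above):

```python
import numpy as np
from scipy import ndimage

def mae_euclidean_np(y_true, y_pred, thresh=0.5):
    """Mean absolute difference of the euclidean distance maps.

    Each map holds, per pixel, the distance to the nearest border pixel.
    distance_transform_edt measures distance to the nearest zero, so the
    border pixels (value 1) are turned into zeros first.
    """
    d_true = ndimage.distance_transform_edt(y_true < 0.5)
    d_pred = ndimage.distance_transform_edt(y_pred < thresh)
    return np.abs(d_pred - d_true).mean()

# toy example: a vertical border line shifted by one pixel
y_true = np.zeros((8, 8)); y_true[:, 3] = 1
y_pred = np.zeros((8, 8)); y_pred[:, 4] = 1
print(mae_euclidean_np(y_true, y_pred))  # 1.0
```

For the one-pixel shift the mean error is exactly 1.0, matching the observation that parallel FP/FN borders contribute only a small, nearly constant distance error.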
# model evaluation on train set
best_model.evaluate(X_train, y_train, batch_size=8)
4/4 [==============================] - 1s 58ms/step - loss: 0.2292 - mae_euclidean: 0.7226 - jacard_coef: 0.6282
[0.22923870384693146, 0.7226119041442871, 0.6281639337539673]
# model evaluation on validation set
best_model.evaluate(X_val, y_val, batch_size=8)
1/1 [==============================] - 0s 62ms/step - loss: 0.2979 - mae_euclidean: 1.2925 - jacard_coef: 0.5409
[0.2979312539100647, 1.292516827583313, 0.540920078754425]
# model evaluation on test set
best_model.evaluate(X_test, y_test, batch_size=8)
1/1 [==============================] - 0s 328ms/step - loss: 0.3015 - mae_euclidean: 1.1366 - jacard_coef: 0.5367
[0.3015058636665344, 1.1366015672683716, 0.5366954803466797]
The model performance is similar on the test and the validation dataset.
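For reference, the `jacard_coef` metric used in all the evaluations above can be reproduced in a few lines. A sketch (the exact Keras implementation is assumed to use a smoothing term like this to avoid division by zero):

```python
import numpy as np

def jacard_coef(y_true, y_pred, smooth=1.0):
    """Intersection over union with a smoothing term, on flattened masks."""
    y_true = np.asarray(y_true, dtype=float).ravel()
    y_pred = np.asarray(y_pred, dtype=float).ravel()
    intersection = np.sum(y_true * y_pred)
    union = np.sum(y_true) + np.sum(y_pred) - intersection
    return (intersection + smooth) / (union + smooth)

print(jacard_coef([1, 1, 0, 0], [1, 0, 0, 0]))  # (1 + 1) / (2 + 1) ~ 0.667
```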
# analyzing all test images (original and augmented by median filter)
plt.figure(figsize=(16, 12))
for idx in range(X_test.shape[0]):
im = visual_eval(model=best_model, test_img=X_test[idx], mask_expected=y_test[idx], show=False)
plt.subplot(2, 2, idx+1)
plt.imshow(im)
plt.title(f"Test image {idx}")
plt.tight_layout()
plt.show()
The FN pixels could probably be reduced by adding more training data (no manual labeling is available).
# functions needed to analyze full picture 1280x880 pixels
def create_patch_tensor(image):
"""
Prepares patches from an image.
The output tensor has shape [25, 176, 256, 1]: 25 patches with resolution 176x256.
"""
im = np.expand_dims([image], -1)
p = tf.image.extract_patches(images=im,
sizes=[1, 176, 256, 1],
strides=[1, 176, 256, 1],
rates=[1, 1, 1, 1],
padding='VALID') # leftover rows/columns that do not fill a full patch are discarded
n, r, c = p.shape[0], p.shape[1], p.shape[2]
if r != 5 or c != 5:
print(f"The function is intended for an input shape of 880-893x1280, but shape {image.shape} was given")
p = tf.reshape(p, [n*r*c, 176, 256, 1])
return p
def get_border_mask_and_preview(img, model=best_model, thresh=0.50):
assert (img.shape == (880, 1280))
img = img_as_float32(img)
img_p = create_patch_tensor(img)
total_preview = np.zeros(tf.repeat(img_p, 3, axis=-1).shape)
total_mask = np.zeros(tf.squeeze(img_p).shape, dtype=np.uint8)
for i in range(5*5):
patch = img_p[i]
patch = img_as_float32(patch)
preview = patch.copy()
preview = tf.repeat(preview, 3, -1).numpy()
pred = model.predict(np.expand_dims(patch, 0))
if thresh != 0.5:
mask = np.zeros(patch.shape[0:2])
mask[np.squeeze(pred) > thresh] = 1
preview[mask == 1] = (0.5, 0.5, 0.0)
mask = np.zeros(patch.shape[0:2])
mask[np.squeeze(pred) > 0.50] = 1
preview[mask == 1] = (1.0, 0.0, 0.0)
total_mask[i] = mask
total_preview[i] = preview
total_mask = np.reshape(total_mask, (5, 5, IMG_HEIGHT, IMG_WIDTH))
total_mask = np.hstack(np.hstack(total_mask))
total_preview = np.reshape(total_preview, (5, 5, IMG_HEIGHT, IMG_WIDTH, 3))
total_preview = np.hstack(np.hstack(total_preview))
return total_mask, total_preview
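The patch split done by tf.image.extract_patches and the hstack(hstack(...)) reassembly in get_border_mask_and_preview can be mirrored in plain NumPy, which makes the round trip easy to verify on a toy image. A sketch with 2x3 patches standing in for the real 176x256 ones:

```python
import numpy as np

def split_patches(img, ph, pw):
    """Split an (H, W) image into non-overlapping (ph, pw) patches, row-major."""
    h, w = img.shape
    assert h % ph == 0 and w % pw == 0, "image must tile exactly"
    return (img.reshape(h // ph, ph, w // pw, pw)
               .transpose(0, 2, 1, 3)
               .reshape(-1, ph, pw))

def join_patches(patches, rows, cols):
    """Inverse of split_patches: the hstack(hstack(...)) trick used above."""
    grid = patches.reshape(rows, cols, *patches.shape[1:])
    return np.hstack(np.hstack(grid))

img = np.arange(4 * 6).reshape(4, 6)
p = split_patches(img, 2, 3)                       # 4 patches of shape (2, 3)
assert np.array_equal(join_patches(p, 2, 2), img)  # lossless round trip
```

The same join_patches shape logic also works for 3-channel previews, since the trailing axes are carried through unchanged.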
def get_grain_prop_and_preview(img, mask_boundaries):
"""
img - input image, grayscale
mask_boundaries - mask for the boundaries
returns:
- grain statistics
- colored preview of the grains
uses the watershed algorithm
"""
if img.ndim != 2:
raise ValueError("Grayscale image expected")
if img.dtype != np.float64 and img.dtype != np.float32:
raise ValueError("Image must be float32 or float64!")
if img.dtype == np.float64:
img = img_as_float32(img)
# obtaining the distance map and the peaks
mask_boundaries = mask_boundaries.astype(bool)
mask_grains = ~mask_boundaries
distance_map = ndimage.distance_transform_edt(mask_grains)
peak_idx = peak_local_max(distance_map, min_distance=5, exclude_border=True)
# filter unique peaks only, that are not connected to other peaks
dmap = distance_map.copy() # dmap will be modified by flood fill
unique_idx = []
for peak in peak_idx: # start from biggest peaks to the smallest
max_dist = dmap[peak[0], peak[1]] # the value of the peak at x, y
if max_dist > 0:
# this peak has not been eaten so far
unique_idx.append(peak)
dmap = flood_fill(dmap, (peak[0], peak[1]), new_value=0, tolerance=0.6*max_dist) #tolerance=0.3*max_dist
unique_idx = np.array(unique_idx)
# mask for major peaks that survived the flood_fill
peak_mask = np.zeros_like(distance_map, dtype=bool)
peak_mask[tuple(unique_idx.T)] = True
cross = np.array([[0, 1, 0],
[1, 1, 1],
[0, 1, 0],])
peak_preview = binary_dilation(peak_mask, cross)
# mask for all peaks
peak_mask_all = np.zeros_like(distance_map, dtype=bool)
peak_mask_all[tuple(peak_idx.T)] = True
peak_preview_all = binary_dilation(peak_mask_all, cross)
# Perform connected component analysis then apply Watershed
markers = ndimage.label(peak_mask, structure=np.ones((3, 3)))[0]
labels = watershed(-distance_map, markers, mask=mask_grains, watershed_line=True)
labels = clear_border(labels)
peak_preview[labels == 0] = 0 # remove the center annotation
# create statistics
regions = measure.regionprops(labels)
grain_prop = {'area': list(),
'd_max' : list()}
for prop in regions:
area = prop.filled_area
perimeter = prop.perimeter
grain_prop['area'].append(area + 0.5 * perimeter)
grain_prop['d_max'].append(prop.feret_diameter_max)
# create colored picture
x = np.linspace(0.0, 1, 20)
colors = plt.get_cmap('tab20')(x)[:,:3]
colors = np.delete(colors, [14, 15], 0) # remove grays
grain_preview = color.label2rgb(labels, image=img, colors=colors, bg_label=0, alpha=0.2)
# grain_preview[peak_preview_all] = (0.5, 0.5, 0.5) # show minor centers of grains
grain_preview[peak_preview] = (1, 1, 1) # show centers of grains
grain_preview[~mask_grains] = (0.3, 0, 0) # show borders from the ML
return grain_prop, grain_preview
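The core of this routine (boundary mask, distance transform, markers, segmented grains) can be exercised on a tiny synthetic mask. The sketch below uses only scipy.ndimage with simple connected-component labelling in place of the full watershed/flood-fill logic, so it only illustrates the idea:

```python
import numpy as np
from scipy import ndimage

# synthetic border mask: a cross splitting an 11x11 field into 4 grains
mask_boundaries = np.zeros((11, 11), dtype=bool)
mask_boundaries[5, :] = True
mask_boundaries[:, 5] = True

mask_grains = ~mask_boundaries
# distance from every grain pixel to the nearest boundary pixel
distance_map = ndimage.distance_transform_edt(mask_grains)
# connected-component labelling of the grain interiors
labels, n_grains = ndimage.label(mask_grains)
print(n_grains)  # 4
# per-grain pixel counts, the basis for the area statistic above
areas = ndimage.sum(mask_grains, labels, index=range(1, n_grains + 1))
print(areas)  # [25. 25. 25. 25.]
```

The watershed step becomes important when grains touch through gaps in the predicted borders; plain labelling would then merge them, while markers from the distance-map peaks still separate them.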
def detec_grains_in_folder(path, model=best_model):
"""
path should be given without a trailing path separator.
creates border/grain previews and grain statistics for every jpg image in the folder.
"""
path_out = path
df_folder = pd.DataFrame()
for image in os.listdir(path):
if image.split(".")[-1] == 'jpg':
img_path = path + "\\" + image
print(f"Working on {img_path}")
# reading and processing the image
img = io.imread(img_path, as_gray=True)
img = img_as_float32(img)
mask, mask_preview = get_border_mask_and_preview(img, model)
grain_prop, colored_preview = get_grain_prop_and_preview(img, mask)
# saving the ML borders
vis = mask_preview.copy()
vis[vis > 1] = 1
vis = img_as_ubyte(vis)
new_filename = img_path.replace(".jpg", "_borders.tif")
io.imsave(new_filename, vis)
# saving colored grains
vis = colored_preview.copy()
vis[vis > 1] = 1
vis = img_as_ubyte(vis)
new_filename = img_path.replace(".jpg", "_grains.tif")
io.imsave(new_filename, vis)
df_image = pd.DataFrame(grain_prop)
df_image.area = df_image.area * 1 # number of pixels
df_image.d_max = df_image.d_max * 1 # pixels
df_image["file"] = image
df_folder = pd.concat([df_folder, df_image], axis=0, ignore_index=True)
df_folder.to_csv(path_out + "\\" + "grain_summary.csv")
return df_folder
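Downstream, the per-folder grain_summary.csv can be aggregated per image, e.g. into grain counts and an equivalent-circle diameter. A sketch (the DataFrame below is made-up stand-in data with the same columns as `grain_prop`; in practice it would come from pd.read_csv on the generated csv):

```python
import numpy as np
import pandas as pd

# stand-in for reading the generated grain_summary.csv
df = pd.DataFrame({
    "area": [120.0, 80.0, 200.0, 150.0],
    "d_max": [14.0, 11.0, 18.0, 16.0],
    "file": ["production_1.jpg"] * 2 + ["production_2.jpg"] * 2,
})
# equivalent-circle diameter from the (pixel) area
df["d_equiv"] = 2 * np.sqrt(df["area"] / np.pi)
summary = df.groupby("file").agg(
    n_grains=("area", "size"),
    mean_area=("area", "mean"),
    mean_d_equiv=("d_equiv", "mean"),
)
print(summary)
```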
# creating summaries in each folder
# grain_stats = detec_grains_in_folder(r"datasets\production") # already done
The processed images are stored in the original folder.
img = io.imread(r"datasets\production\production_1_borders.tif")
plt.figure(figsize=(14,10))
plt.imshow(img)
plt.title('example border preview')
plt.show()
# removing image footer and renaming
# path = r"datasets\production"
# i = 1
# for image in os.listdir(path):
# if image.split(".")[-1] == 'jpg':
# img_path = path + "\\" + image
# img = io.imread(img_path, as_gray=True)
# img = img_as_ubyte(img)
# img = img[2:882,:]
# io.imsave(path + f"\\production_{i}.jpg", img)
# i += 1
Comparison to the classical ML approach using XGBoost: https://github.com/Chehlarov/Machine-Learning/tree/main/00%20-%20project
Notes on the dataset:
Lessons learned:
Ideas for improvement:
Conclusions: